pjaehrling / finetuneAlexVGG

Licence: Apache-2.0 License
Finetune ConvNets with Tensorflow


Finetune AlexNet & VGG with Tensorflow

My AlexNet and VGG16 model implementations for TensorFlow, with a validation and a finetune/retrain script. Also includes wrapper model classes for the TensorFlow Slim implementations of VGG16 and Inception V3 (finetuning does not really work with those so far), plus Jupyter notebooks to test the different preprocessing scripts, run a classification and finetune a model.

Requirements

  • Python 2.7 or 3
  • TensorFlow >= 1.13rc0 (I guess everything from version 1.0 on will work)
  • Numpy

Content

  • validate.py: Script to validate the implemented models and the downloaded weights
  • finetune.py: Script to run the finetuning process
  • helper/*: Helper scripts/classes to load data and run the retraining
  • models/*: A parent model class and different model implementations (AlexNet, VGG, Inception)
  • images/*: Four example images, used in the validation script
  • preprocessing/*: Scripts to run different ways of image preprocessing (crop, resize, ...)
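As a rough illustration of what such a preprocessing step does, a center crop down to the fixed input size that AlexNet/VGG expect can be sketched with NumPy (a hypothetical sketch, not the repository's actual preprocessing code):

```python
import numpy as np

def center_crop(image, crop_h, crop_w):
    """Crop the central crop_h x crop_w region from an H x W x C image array."""
    h, w = image.shape[:2]
    top = (h - crop_h) // 2
    left = (w - crop_w) // 2
    return image[top:top + crop_h, left:left + crop_w]

# Example: crop a 256x256 "image" down to the 224x224 input size used by VGG.
img = np.zeros((256, 256, 3), dtype=np.uint8)
cropped = center_crop(img, 224, 224)
print(cropped.shape)  # (224, 224, 3)
```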

Weights:

Usage

Validate the model implementations, image preprocessing and initial weights

python validate.py -model alex
...
python validate.py -model [alex, vgg, vgg_slim, inc_v3]

Run finetuning/retraining on selected layers

python finetune.py -image_path /path/to/images -model alex
...
python finetune.py -image_path /path/to/images -model [alex, vgg]
python finetune.py -image_file /path/to/images.txt -model [alex, vgg]

Using -image_path: /path/to/images should point to a folder with a set of sub-folders, each named after one of your final categories and containing only images from that category. Using -image_file: /path/to/images.txt should be a file listing image paths and labels, one pair per line.

e.g.

cat /path/to/cat1.jpg
cat /path/to/cat2.jpg
dog /path/to/dog1.jpg
...
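A list file in that format can be generated from the category-per-sub-folder layout with a few lines of Python (a hypothetical helper, not part of the repository):

```python
import os

def write_image_list(image_dir, out_file):
    """Write '<label> <path>' lines for every image found in
    image_dir/<label>/*.jpg|jpeg|png -- the list-file format described above."""
    lines = []
    for label in sorted(os.listdir(image_dir)):
        sub = os.path.join(image_dir, label)
        if not os.path.isdir(sub):
            continue
        for name in sorted(os.listdir(sub)):
            if name.lower().endswith(('.jpg', '.jpeg', '.png')):
                lines.append('%s %s' % (label, os.path.join(sub, name)))
    with open(out_file, 'w') as f:
        f.write('\n'.join(lines))
```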

Other options:

  • -write_checkpoint_on_each_epoch: Save a checkpoint after each epoch (by default one is saved only at the end)
python finetune.py ... -write_checkpoint_on_each_epoch
  • -init_from_ckpt /path/to/file.ckpt: Start training from a saved checkpoint file by providing the path to that file (this restores weights on all layers). By default the initial weights are the pretrained ImageNet weights (numpy file or checkpoint), and the layers being retrained are not restored.
python finetune.py ... -init_from_ckpt /path/to/file.ckpt
  • -use_adam_optimizer: Set this to use the AdamOptimizer for training. By default the GradientDescentOptimizer will be used.
python finetune.py ... -use_adam_optimizer
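These flags could be declared with argparse roughly as follows (a sketch of how such single-dash flags are typically wired up; the repository's actual argument parsing may differ):

```python
import argparse

# Hypothetical sketch of the extra finetune.py flags; defaults match the
# behaviour described above (GradientDescentOptimizer, checkpoint only at the end).
parser = argparse.ArgumentParser()
parser.add_argument('-write_checkpoint_on_each_epoch', action='store_true',
                    help='Save a checkpoint after every epoch instead of only at the end')
parser.add_argument('-init_from_ckpt', default='',
                    help='Path to a checkpoint file to restore all layer weights from')
parser.add_argument('-use_adam_optimizer', action='store_true',
                    help='Use AdamOptimizer instead of the default GradientDescentOptimizer')

args = parser.parse_args(['-use_adam_optimizer', '-init_from_ckpt', '/tmp/model.ckpt'])
print(args.use_adam_optimizer, args.init_from_ckpt)  # True /tmp/model.ckpt
```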

Create Features

You can create features (activations at a given layer) and save them to the filesystem. The features will be stored as .txt files; the filename is the MD5 hash of the image's file path. In addition, a mapping file will be created.

The -image_path/-image_file and -model parameters work the same way as they do for finetuning. In addition you need to provide the layer you want to use via -layer (e.g. -layer fc6) and the directory the features should be stored in via -feature_dir (e.g. -feature_dir /path/to/features).

python create_features.py -image_path /path/to/images -model vgg -layer fc6 -feature_dir /path/to/features
python create_features.py -image_file /path/to/images.txt -model inc_v3 -layer PreLogits -feature_dir /path/to/features
...
python create_features.py -image_path /path/to/images -model [alex, vgg, inc_v3] -layer layername -feature_dir /path/to/features
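The naming scheme described above (MD5 of the image path as the .txt filename, plus a mapping file) can be sketched like this (a hypothetical illustration, not the repository's actual code; the mapping-file name is assumed):

```python
import hashlib
import os

def save_features(features, image_path, feature_dir, mapping_file='mapping.txt'):
    """Save a feature vector as <md5-of-image-path>.txt and append the
    hash -> image path pair to a mapping file in feature_dir."""
    digest = hashlib.md5(image_path.encode('utf-8')).hexdigest()
    feature_path = os.path.join(feature_dir, digest + '.txt')
    with open(feature_path, 'w') as f:
        f.write(' '.join('%f' % v for v in features))
    with open(os.path.join(feature_dir, mapping_file), 'a') as f:
        f.write('%s %s\n' % (digest, image_path))
    return feature_path
```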

Useful sources:
