
TropComplique / ShuffleNet-tensorflow

License: MIT
A ShuffleNet implementation tested on the Tiny ImageNet dataset


ShuffleNet

This is an implementation of ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices. It is written in TensorFlow and tested on the Tiny ImageNet dataset, which consists of 64x64 images in 200 classes. The network reaches ~49% validation accuracy after 40 epochs (~2.2 hours on a p2.xlarge instance).
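The defining operation of a ShuffleNet unit is the channel shuffle between grouped convolutions. It can be sketched with NumPy as follows (the repository implements this in TensorFlow; the function below is an illustrative reimplementation, not the repository's code):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Shuffle the channels of an NHWC tensor across `groups` conv groups.

    Reshape the channel axis into (groups, channels_per_group), swap the
    two new axes, and flatten back. Channels from different groups end up
    interleaved, so the next grouped convolution sees features from all
    groups instead of only its own.
    """
    n, h, w, c = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, h, w, groups, c // groups)
    x = x.transpose(0, 1, 2, 4, 3)
    return x.reshape(n, h, w, c)

# 8 channels in 2 groups: [0..3 | 4..7] becomes interleaved [0,4,1,5,2,6,3,7]
x = np.arange(8).reshape(1, 1, 1, 8)
print(channel_shuffle(x, 2).ravel())
```

Because the shuffle is just a reshape-transpose-reshape, it adds no parameters and almost no compute, which is why the paper can afford it between every pair of grouped convolutions.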

Implementation details

  • I use a reduced-size ShuffleNet: the network in the original paper has more layers,
    but it is easy to change a couple of parameters in shufflenet/CONSTANTS.py to match the original.
  • For the input pipeline I use tf.data.TFRecordDataset.
  • For data augmentation I use random 56x56 crops and random color manipulations.
  • I use a reduce-on-plateau learning rate scheduler.
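The reduce-on-plateau schedule mentioned above can be sketched in plain Python (the factor and patience values here are illustrative defaults, not the repository's settings):

```python
class ReduceOnPlateau:
    """Cut the learning rate by `factor` when the monitored validation
    loss fails to improve for more than `patience` consecutive epochs."""

    def __init__(self, lr, factor=0.1, patience=3):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.best = float("inf")   # best validation loss seen so far
        self.wait = 0              # epochs since the last improvement

    def step(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss
            self.wait = 0
        else:
            self.wait += 1
            if self.wait > self.patience:
                self.lr *= self.factor
                self.wait = 0
        return self.lr

# Loss stalls at 1.0: lr stays at 0.1 until patience is exceeded, then halves
sched = ReduceOnPlateau(lr=0.1, factor=0.5, patience=1)
print([sched.step(1.0) for _ in range(3)])
```

The scheduler is called once per epoch with the validation loss, which fits the repository's evaluate-after-each-epoch training loop.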

How to use it

Assuming the Tiny ImageNet data is in /home/ubuntu/data/tiny-imagenet-200/, the steps are:

  • cd ShuffleNet-tensorflow
  • python tiny_imagenet/move_data.py
    to slightly change the folder structure of the data.
  • python image_dataset_to_tfrecords.py
    to convert the dataset to tfrecords format.
  • (optional) If you want to change the network's length,
    edit the number of ShuffleNet Units in shufflenet/CONSTANTS.py.
  • python train.py
    to begin training. Evaluation runs after each epoch.
  • Logs and the saved model will be written to logs/run0 and saved/run0.

To train on your own dataset, you need to change a few values in the shufflenet/CONSTANTS.py file.
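The kind of values to edit there looks roughly like this (the constant names below are hypothetical placeholders, not the file's real identifiers; check shufflenet/CONSTANTS.py for the actual names):

```python
# Hypothetical sketch of shufflenet/CONSTANTS.py — names are illustrative.
NUM_CLASSES = 200                 # number of classes in your dataset
IMAGE_SIZE = 56                   # crop size fed to the network
BATCH_SIZE = 128                  # training batch size
NUM_SHUFFLENET_UNITS = [2, 4, 2]  # units per stage; increase to match the paper
GROUPS = 3                        # groups in the grouped convolutions
```

Adjusting the units-per-stage list is also how you restore the full-depth network from the paper.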

Requirements

  • Python 3.6
  • tensorflow 1.4
  • tqdm, Pillow, pandas, numpy