
weiaicunzai / Pytorch Cifar100

Practice on cifar100 (ResNet, DenseNet, VGG, GoogleNet, InceptionV3, InceptionV4, Inception-ResNetv2, Xception, Resnet In Resnet, ResNext, ShuffleNet, ShuffleNetv2, MobileNet, MobileNetv2, SqueezeNet, NasNet, Residual Attention Network, SENet, WideResNet)


Projects that are alternatives of or similar to Pytorch Cifar100

Tensorrtx
Implementation of popular deep learning networks with TensorRT network definition API
Stars: ✭ 3,456 (+42.63%)
Mutual labels:  resnet, resnext, squeezenet, inceptionv3, googlenet
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-99.13%)
Mutual labels:  densenet, resnet, inceptionv4, shufflenet
Segmentation models
Segmentation models with pretrained backbones. Keras and TensorFlow Keras.
Stars: ✭ 3,575 (+47.54%)
Mutual labels:  resnet, mobilenet, densenet, resnext
Basic cnns tensorflow2
A tensorflow2 implementation of some basic CNNs(MobileNetV1/V2/V3, EfficientNet, ResNeXt, InceptionV4, InceptionResNetV1/V2, SENet, SqueezeNet, DenseNet, ShuffleNetV2, ResNet).
Stars: ✭ 374 (-84.56%)
Mutual labels:  image-classification, resnet, densenet, resnext
Mmclassification
OpenMMLab Image Classification Toolbox and Benchmark
Stars: ✭ 532 (-78.04%)
Mutual labels:  image-classification, resnet, mobilenet, resnext
Keras Idiomatic Programmer
Books, Presentations, Workshops, Notebook Labs, and Model Zoo for Software Engineers and Data Scientists wanting to learn the TF.Keras Machine Learning framework
Stars: ✭ 720 (-70.28%)
Mutual labels:  resnet, mobilenet, densenet, resnext
cifar-tensorflow
No description or website provided.
Stars: ✭ 18 (-99.26%)
Mutual labels:  image-classification, densenet, resnet, mobilenet
Pytorch classification
A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forests, and model distillation.
Stars: ✭ 395 (-83.7%)
Mutual labels:  image-classification, resnet, densenet, resnext
awesome-computer-vision-models
A list of popular deep learning models related to classification, segmentation and detection problems
Stars: ✭ 419 (-82.71%)
Mutual labels:  image-classification, densenet, resnet, nasnet
Classification models
Classification models trained on ImageNet. Keras.
Stars: ✭ 938 (-61.29%)
Mutual labels:  resnet, mobilenet, densenet, resnext
bird species classification
Supervised Classification of bird species 🐦 in high resolution images, especially for, Himalayan birds, having diverse species with fairly low amount of labelled data
Stars: ✭ 59 (-97.57%)
Mutual labels:  image-classification, inceptionv3, inception-resnet-v2
pyro-vision
Computer vision library for wildfire detection
Stars: ✭ 33 (-98.64%)
Mutual labels:  image-classification, densenet, resnet
Awesome Computer Vision Models
A list of popular deep learning models related to classification, segmentation and detection problems
Stars: ✭ 278 (-88.53%)
Mutual labels:  image-classification, resnet, densenet
gluon2pytorch
Gluon to PyTorch deep neural network model converter
Stars: ✭ 72 (-97.03%)
Mutual labels:  densenet, resnet, nasnet
python cv AI ML
Computer vision, artificial intelligence, machine learning, deep learning, and more with Python.
Stars: ✭ 73 (-96.99%)
Mutual labels:  densenet, resnet, googlenet
Bsconv
Reference implementation for Blueprint Separable Convolutions (CVPR 2020)
Stars: ✭ 84 (-96.53%)
Mutual labels:  image-classification, resnet, mobilenet
Pytorch Classification
Classification with PyTorch.
Stars: ✭ 1,268 (-47.67%)
Mutual labels:  resnet, densenet, resnext
Skin Lesions Classification DCNNs
Transfer Learning with DCNNs (DenseNet, Inception V3, Inception-ResNet V2, VGG16) for skin lesions classification
Stars: ✭ 47 (-98.06%)
Mutual labels:  image-classification, densenet, inceptionv3
Cifar Zoo
PyTorch implementation of CNNs for CIFAR benchmark
Stars: ✭ 584 (-75.9%)
Mutual labels:  resnet, densenet, resnext
Imageai
A python library built to empower developers to build applications and systems with self-contained Computer Vision capabilities
Stars: ✭ 6,734 (+177.92%)
Mutual labels:  densenet, squeezenet, inceptionv3

Pytorch-cifar100

Practice on CIFAR-100 using PyTorch.

Requirements

This is my experiment environment:

  • python3.6
  • pytorch1.6.0+cu101
  • tensorboard 2.2.2 (optional)

Usage

1. enter directory

$ cd pytorch-cifar100

2. dataset

I will use the CIFAR-100 dataset from torchvision, since it's more convenient, but I also kept the sample code for writing your own dataset module in the dataset folder, as an example for people who don't know how to write one.
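
For reference, here is a minimal sketch of loading CIFAR-100 through torchvision. The normalization statistics below are commonly quoted CIFAR-100 values and are an assumption on my part, not necessarily the exact ones used in this repo:

```python
import torchvision
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

# assumed per-channel mean/std for CIFAR-100; the repo may use its own values
mean = (0.5071, 0.4865, 0.4409)
std = (0.2673, 0.2564, 0.2762)

transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean, std),
])

# downloads the dataset into ./data on first use
train_set = torchvision.datasets.CIFAR100(
    root='./data', train=True, download=True, transform=transform_train)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)
```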

3. run tensorboard (optional)

Install tensorboard

$ pip install tensorboard
$ mkdir runs
Run tensorboard
$ tensorboard --logdir='runs' --port=6006 --host='localhost'
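
Inside the training script, logging to the runs directory might look like the sketch below; the tag name and values are illustrative, not necessarily what train.py writes:

```python
from torch.utils.tensorboard import SummaryWriter

# write event files into ./runs so the tensorboard command above can find them
writer = SummaryWriter(log_dir='runs')
for step in range(100):
    # placeholder metric; in a real training loop this would be the actual loss
    writer.add_scalar('Train/loss', 1.0 / (step + 1), step)
writer.close()
```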

4. train the model

You need to specify the network you want to train using the -net argument:

# use gpu to train vgg16
$ python train.py -net vgg16 -gpu

Sometimes you might want to use warmup training by setting -warm to 1 or 2, to prevent the network from diverging during the early training phase.
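
A minimal sketch of what linear warmup could look like, assuming an existing optimizer; this repo ships its own warmup scheduler (which may step per batch rather than per epoch), so the LambdaLR version below is just one simple way to express the idea:

```python
from torch.optim.lr_scheduler import LambdaLR

warm_epochs = 1  # corresponds to -warm 1

def warmup_factor(epoch):
    # ramp the lr linearly from ~0 up to its base value over the warmup epochs,
    # then hold it at the base value until the main schedule takes over
    if epoch < warm_epochs:
        return (epoch + 1) / warm_epochs
    return 1.0

warmup_scheduler = LambdaLR(optimizer, lr_lambda=warmup_factor)
```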

The supported -net arguments are listed below; a sketch of how such a flag might be dispatched to a model constructor follows the list:

squeezenet
mobilenet
mobilenetv2
shufflenet
shufflenetv2
vgg11
vgg13
vgg16
vgg19
densenet121
densenet161
densenet201
googlenet
inceptionv3
inceptionv4
inceptionresnetv2
xception
resnet18
resnet34
resnet50
resnet101
resnet152
preactresnet18
preactresnet34
preactresnet50
preactresnet101
preactresnet152
resnext50
resnext101
resnext152
attention56
attention92
seresnet18
seresnet34
seresnet50
seresnet101
seresnet152
nasnet
wideresnet
stochasticdepth18
stochasticdepth34
stochasticdepth50
stochasticdepth101
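
A hypothetical sketch of dispatching the -net string to a model constructor; the module paths and function name here are illustrative, and the repo's actual helper may be organized differently:

```python
def get_network(name, num_classes=100):
    """Map a -net argument to a model constructor (illustrative only)."""
    if name == 'vgg16':
        from models.vgg import vgg16_bn          # hypothetical module path
        return vgg16_bn(num_classes=num_classes)
    elif name == 'resnet18':
        from models.resnet import resnet18       # hypothetical module path
        return resnet18(num_classes=num_classes)
    # ... one branch per supported architecture ...
    else:
        raise ValueError(f'unsupported network: {name}')
```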

Normally, the weights file with the best accuracy will be written to disk with the name suffix 'best' (by default in the checkpoint folder).
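
Saving the best weights usually reduces to comparing test accuracy after each epoch, roughly as in this sketch; the path and file-name pattern are assumptions, not necessarily the repo's exact convention:

```python
import os
import torch

best_acc = 0.0
checkpoint_dir = 'checkpoint'  # assumed default location
os.makedirs(checkpoint_dir, exist_ok=True)

# inside the epoch loop, after evaluating accuracy `acc` for `model`:
if acc > best_acc:
    best_acc = acc
    # e.g. checkpoint/vgg16-123-best.pth; the exact naming is illustrative
    torch.save(model.state_dict(),
               os.path.join(checkpoint_dir, f'{args.net}-{epoch}-best.pth'))
```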

5. test the model

Test the model using test.py

$ python test.py -net vgg16 -weights path_to_vgg16_weights_file
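
Evaluation boils down to loading the saved state dict and measuring top-1/top-5 error. A sketch, assuming a trained model and a standard CIFAR-100 test DataLoader:

```python
import torch

model.load_state_dict(torch.load('path_to_vgg16_weights_file'))
model.eval()

correct_1, correct_5, total = 0, 0, 0
with torch.no_grad():
    for images, labels in test_loader:       # assumed CIFAR-100 test loader
        outputs = model(images)
        _, top5 = outputs.topk(5, dim=1)     # indices of the 5 highest logits
        labels = labels.view(-1, 1)
        correct_1 += (top5[:, :1] == labels).sum().item()
        correct_5 += (top5 == labels).any(dim=1).sum().item()
        total += labels.size(0)

print(f'top-1 err: {1 - correct_1 / total:.4f}')
print(f'top-5 err: {1 - correct_5 / total:.4f}')
```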

Implemented Networks

Training Details

I didn't use any training tricks to improve accuracy. If you want to learn more about training tricks, please refer to my other repo, which contains various common training tricks and their PyTorch implementations.

I follow the hyperparameter settings in the paper Improved Regularization of Convolutional Neural Networks with Cutout: initial lr = 0.1, divided by 5 at the 60th, 120th, and 160th epochs, trained for 200 epochs with batch size 128, weight decay 5e-4, and Nesterov momentum of 0.9. You could also use the hyperparameters from the papers Regularizing Neural Networks by Penalizing Confident Output Distributions and Random Erasing Data Augmentation: initial lr = 0.1, divided by 10 at the 150th and 225th epochs, trained for 300 epochs with batch size 128; this schedule is more commonly used. You could decrease the batch size to 64, or whatever suits you, if you don't have enough GPU memory.
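
As a sketch, the first schedule can be expressed with standard PyTorch APIs as below; model, train_loader, and train_one_epoch are placeholders, not this repo's actual code:

```python
import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR

# SGD with Nesterov momentum and weight decay, as described above
optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9,
                      weight_decay=5e-4, nesterov=True)
# dividing the lr by 5 means gamma = 0.2 at epochs 60, 120, and 160
scheduler = MultiStepLR(optimizer, milestones=[60, 120, 160], gamma=0.2)

for epoch in range(200):
    train_one_epoch(model, train_loader, optimizer)  # hypothetical helper
    scheduler.step()
```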

You can choose whether to use TensorBoard to visualize your training procedure.

Results

These are the results I can get from each model. Since I use the same hyperparameters to train all of the networks, some networks might not reach their best possible result with these settings; you could try tuning the hyperparameters yourself to get better results.

|dataset|network|params|top-1 err (%)|top-5 err (%)|epochs (lr = 0.1)|epochs (lr = 0.02)|epochs (lr = 0.004)|epochs (lr = 0.0008)|total epochs|
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
|cifar100|mobilenet|3.3M|34.02|10.56|60|60|40|40|200|
|cifar100|mobilenetv2|2.36M|31.92|9.02|60|60|40|40|200|
|cifar100|squeezenet|0.78M|30.59|8.36|60|60|40|40|200|
|cifar100|shufflenet|1.0M|29.94|8.35|60|60|40|40|200|
|cifar100|shufflenetv2|1.3M|30.49|8.49|60|60|40|40|200|
|cifar100|vgg11_bn|28.5M|31.36|11.85|60|60|40|40|200|
|cifar100|vgg13_bn|28.7M|28.00|9.71|60|60|40|40|200|
|cifar100|vgg16_bn|34.0M|27.07|8.84|60|60|40|40|200|
|cifar100|vgg19_bn|39.0M|27.77|8.84|60|60|40|40|200|
|cifar100|resnet18|11.2M|24.39|6.95|60|60|40|40|200|
|cifar100|resnet34|21.3M|23.24|6.63|60|60|40|40|200|
|cifar100|resnet50|23.7M|22.61|6.04|60|60|40|40|200|
|cifar100|resnet101|42.7M|22.22|5.61|60|60|40|40|200|
|cifar100|resnet152|58.3M|22.31|5.81|60|60|40|40|200|
|cifar100|preactresnet18|11.3M|27.08|8.53|60|60|40|40|200|
|cifar100|preactresnet34|21.5M|24.79|7.68|60|60|40|40|200|
|cifar100|preactresnet50|23.9M|25.73|8.15|60|60|40|40|200|
|cifar100|preactresnet101|42.9M|24.84|7.83|60|60|40|40|200|
|cifar100|preactresnet152|58.6M|22.71|6.62|60|60|40|40|200|
|cifar100|resnext50|14.8M|22.23|6.00|60|60|40|40|200|
|cifar100|resnext101|25.3M|22.22|5.99|60|60|40|40|200|
|cifar100|resnext152|33.3M|22.40|5.58|60|60|40|40|200|
|cifar100|attention59|55.7M|33.75|12.90|60|60|40|40|200|
|cifar100|attention92|102.5M|36.52|11.47|60|60|40|40|200|
|cifar100|densenet121|7.0M|22.99|6.45|60|60|40|40|200|
|cifar100|densenet161|26M|21.56|6.04|60|60|60|40|200|
|cifar100|densenet201|18M|21.46|5.9|60|60|40|40|200|
|cifar100|googlenet|6.2M|21.97|5.94|60|60|40|40|200|
|cifar100|inceptionv3|22.3M|22.81|6.39|60|60|40|40|200|
|cifar100|inceptionv4|41.3M|24.14|6.90|60|60|40|40|200|
|cifar100|inceptionresnetv2|65.4M|27.51|9.11|60|60|40|40|200|
|cifar100|xception|21.0M|25.07|7.32|60|60|40|40|200|
|cifar100|seresnet18|11.4M|23.56|6.68|60|60|40|40|200|
|cifar100|seresnet34|21.6M|22.07|6.12|60|60|40|40|200|
|cifar100|seresnet50|26.5M|21.42|5.58|60|60|40|40|200|
|cifar100|seresnet101|47.7M|20.98|5.41|60|60|40|40|200|
|cifar100|seresnet152|66.2M|20.66|5.19|60|60|40|40|200|
|cifar100|nasnet|5.2M|22.71|5.91|60|60|40|40|200|
|cifar100|wideresnet-40-10|55.9M|21.25|5.77|60|60|40|40|200|
|cifar100|stochasticdepth18|11.22M|31.40|8.84|60|60|40|40|200|
|cifar100|stochasticdepth34|21.36M|27.72|7.32|60|60|40|40|200|
|cifar100|stochasticdepth50|23.71M|23.35|5.76|60|60|40|40|200|
|cifar100|stochasticdepth101|42.69M|21.28|5.39|60|60|40|40|200|
