
d-li14 / ghostnet.pytorch

License: MIT
73.6% GhostNet 1.0x pre-trained model on ImageNet

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to ghostnet.pytorch

Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+1837.78%)
Mutual labels:  imagenet, pretrained-models, ghostnet
Mobilenetv3.pytorch
74.3% MobileNetV3-Large and 67.2% MobileNetV3-Small model on ImageNet
Stars: ✭ 283 (+214.44%)
Mutual labels:  imagenet, pretrained-models
Efficientnet Pytorch
A PyTorch implementation of EfficientNet and EfficientNetV2 (coming soon!)
Stars: ✭ 6,685 (+7327.78%)
Mutual labels:  imagenet, pretrained-models
Mobilenetv2.pytorch
72.8% MobileNetV2 1.0 model on ImageNet and a spectrum of pre-trained MobileNetV2 models
Stars: ✭ 369 (+310%)
Mutual labels:  imagenet, pretrained-models
ModelZoo.pytorch
Hands-on ImageNet training. Unofficial ModelZoo project in PyTorch. MobileNetV3 Top-1 75.64🌟, GhostNet 1.3x 75.78🌟
Stars: ✭ 42 (-53.33%)
Mutual labels:  imagenet, mobilenetv3
pigallery
PiGallery: AI-powered Self-hosted Secure Multi-user Image Gallery and Detailed Image analysis using Machine Learning, EXIF Parsing and Geo Tagging
Stars: ✭ 35 (-61.11%)
Mutual labels:  imagenet, pretrained-models
Cnn Models
ImageNet pre-trained models with batch normalization for the Caffe framework
Stars: ✭ 355 (+294.44%)
Mutual labels:  imagenet, pretrained-models
Segmentation models.pytorch
Segmentation models with pretrained backbones. PyTorch.
Stars: ✭ 4,584 (+4993.33%)
Mutual labels:  imagenet, pretrained-models
Classification models
Classification models trained on ImageNet. Keras.
Stars: ✭ 938 (+942.22%)
Mutual labels:  imagenet, pretrained-models
Hbonet
[ICCV 2019] Harmonious Bottleneck on Two Orthogonal Dimensions
Stars: ✭ 94 (+4.44%)
Mutual labels:  imagenet, pretrained-models
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+376.67%)
Mutual labels:  imagenet, pretrained-models
Imgclsmob
Sandbox for training deep learning networks
Stars: ✭ 2,405 (+2572.22%)
Mutual labels:  imagenet, pretrained-models
Pytorch Image Models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXT, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
Stars: ✭ 15,232 (+16824.44%)
Mutual labels:  pretrained-models, mobilenetv3
regnet.pytorch
PyTorch-style and human-readable RegNet with a spectrum of pre-trained models
Stars: ✭ 50 (-44.44%)
Mutual labels:  imagenet, pretrained-models
Neural Backed Decision Trees
Making decision trees competitive with neural networks on CIFAR10, CIFAR100, TinyImagenet200, Imagenet
Stars: ✭ 411 (+356.67%)
Mutual labels:  imagenet, pretrained-models
Efficientnet
Implementation of EfficientNet model. Keras and TensorFlow Keras.
Stars: ✭ 1,920 (+2033.33%)
Mutual labels:  imagenet, pretrained-models
quarkdet
QuarkDet: lightweight object detection in PyTorch. Real-time object detection on mobile devices.
Stars: ✭ 82 (-8.89%)
Mutual labels:  mobilenetv3, ghostnet
cozmo-tensorflow
🤖 Cozmo the Robot recognizes objects with TensorFlow
Stars: ✭ 61 (-32.22%)
Mutual labels:  imagenet
cisip-FIRe
Fast Image Retrieval (FIRe) is an open source project to promote image retrieval research. It implements most of the major binary hashing methods to date, together with different popular backbone networks and public datasets.
Stars: ✭ 40 (-55.56%)
Mutual labels:  imagenet
machine learning course
Artificial intelligence/machine learning course at UCF in Spring 2020 (Fall 2019 and Spring 2019)
Stars: ✭ 47 (-47.78%)
Mutual labels:  pretrained-models

PyTorch Implementation of GhostNet

Reproduction of the GhostNet architecture described in GhostNet: More Features from Cheap Operations by Kai Han, Yunhe Wang, Qi Tian, Jianyuan Guo, Chunjing Xu, and Chang Xu, evaluated on the ILSVRC2012 (ImageNet) benchmark with the PyTorch framework.
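
As background, the key building block of GhostNet is the Ghost module: an ordinary convolution produces a small set of intrinsic feature maps, and cheap depthwise operations then generate the remaining "ghost" features, which are concatenated with the intrinsic ones. The snippet below is a minimal sketch of that idea for illustration only; the class name, layer choices, and defaults are assumptions, not the exact module defined in this repository.

import math

import torch
import torch.nn as nn

class GhostModuleSketch(nn.Module):
    """Simplified Ghost module (illustration only): a primary 1x1 conv yields
    intrinsic feature maps, a cheap depthwise conv yields the 'ghost' maps,
    and the two are concatenated to reach the requested output width."""

    def __init__(self, in_channels, out_channels, ratio=2, dw_kernel=3):
        super().__init__()
        init_channels = math.ceil(out_channels / ratio)   # intrinsic maps
        ghost_channels = init_channels * (ratio - 1)      # cheap 'ghost' maps
        self.out_channels = out_channels

        self.primary = nn.Sequential(
            nn.Conv2d(in_channels, init_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(init_channels),
            nn.ReLU(inplace=True),
        )
        self.cheap = nn.Sequential(
            nn.Conv2d(init_channels, ghost_channels, kernel_size=dw_kernel,
                      padding=dw_kernel // 2, groups=init_channels, bias=False),
            nn.BatchNorm2d(ghost_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        intrinsic = self.primary(x)
        ghost = self.cheap(intrinsic)
        out = torch.cat([intrinsic, ghost], dim=1)
        return out[:, :self.out_channels]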

Pretrained Models

Architecture     # Parameters    MFLOPs    Top-1 / Top-5 Accuracy (%)
GhostNet 1.0x    5.181M          140.77    73.636 / 91.228

import torch

from ghostnet import ghostnet

# Build GhostNet 1.0x and load the released ImageNet weights.
net = ghostnet()
net.load_state_dict(torch.load('pretrained/ghostnet_1x-9c40f966.pth'))
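
To sanity-check the released weights, single-image inference with the usual ImageNet evaluation preprocessing (resize to 256, center crop to 224, normalize with the standard ImageNet statistics) could look like the sketch below. The torchvision transforms, the CPU map_location, and the example image path are assumptions, not part of this repository.

import torch
from PIL import Image
from torchvision import transforms

from ghostnet import ghostnet

# Standard ImageNet evaluation preprocessing (assumed; matches common practice).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

net = ghostnet()
net.load_state_dict(torch.load('pretrained/ghostnet_1x-9c40f966.pth',
                               map_location='cpu'))
net.eval()

img = Image.open('example.jpg').convert('RGB')   # hypothetical input image
with torch.no_grad():
    logits = net(preprocess(img).unsqueeze(0))   # shape: (1, 1000)
top5 = logits.softmax(dim=1).topk(5)
print(top5.indices, top5.values)                 # predicted class ids and probabilities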

Training Strategy

  • Batch size 1024 on 8 GPUs
  • Initial learning rate 0.4
  • Weight decay 0.00004
  • Dropout rate 0.2
  • No weight decay on BN

We keep the above settings fixed and conduct the experiments below with different training techniques, for ablation and reproduction; a hedged PyTorch sketch of this recipe follows the table. During the warmup phase, the learning rate ramps up linearly from 0.1 to 0.4.

Epochs   LR annealing   Warmup   Label smooth   Top-1 / Top-5 Accuracy (%)
240      linear         ×        ×              72.318 / 90.670
360      linear         ×        ×              72.458 / 90.780
240      cosine         ✓        ×              72.772 / 90.902
240      cosine         ✓        ✓              73.636 / 91.228
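
A minimal sketch of this training recipe, assuming standard PyTorch components: SGD with parameter groups so that BN parameters receive no weight decay, a linear warmup from 0.1 to 0.4 followed by cosine annealing, and label smoothing. The warmup length, momentum value, and smoothing factor are assumptions not stated in this README.

import math

import torch
from torch import nn, optim

from ghostnet import ghostnet

# Only the LR endpoints (0.1 -> 0.4), the 240-epoch budget, and the weight decay
# come from this README; the remaining constants are assumptions.
WARMUP_EPOCHS, TOTAL_EPOCHS = 5, 240
WARMUP_START_LR, BASE_LR = 0.1, 0.4

net = ghostnet()

# Exclude BN (and other 1-D) parameters from weight decay, as listed above.
decay, no_decay = [], []
for p in net.parameters():
    (no_decay if p.ndim == 1 else decay).append(p)

optimizer = optim.SGD(
    [{'params': decay, 'weight_decay': 4e-5},
     {'params': no_decay, 'weight_decay': 0.0}],
    lr=BASE_LR, momentum=0.9)                    # momentum value is an assumption

def lr_factor(epoch):
    """Linear warmup from 0.1 to 0.4, then cosine annealing towards zero."""
    if epoch < WARMUP_EPOCHS:
        start = WARMUP_START_LR / BASE_LR
        return start + (1.0 - start) * epoch / WARMUP_EPOCHS
    progress = (epoch - WARMUP_EPOCHS) / (TOTAL_EPOCHS - WARMUP_EPOCHS)
    return 0.5 * (1.0 + math.cos(math.pi * progress))

scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_factor)
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)   # smoothing factor assumed

In the usual training loop, scheduler.step() is called once per epoch to advance the warmup/annealing schedule.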

Citation

@inproceedings{Han_2020_CVPR,
  title={GhostNet: More Features from Cheap Operations},
  author={Han, Kai and Wang, Yunhe and Tian, Qi and Guo, Jianyuan and Xu, Chunjing and Xu, Chang},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month = {June},
  year={2020}
}