
d-li14 / regnet.pytorch

License: MIT
PyTorch-style and human-readable RegNet with a spectrum of pre-trained models

Programming Languages

Python

Projects that are alternatives of or similar to regnet.pytorch

Classification models
Classification models trained on ImageNet. Keras.
Stars: ✭ 938 (+1776%)
Mutual labels:  imagenet, pretrained-models, resnext
Hbonet
[ICCV 2019] Harmonious Bottleneck on Two Orthogonal Dimensions
Stars: ✭ 94 (+88%)
Mutual labels:  imagenet, pretrained-models
Pytorch Classification
Classification with PyTorch.
Stars: ✭ 1,268 (+2436%)
Mutual labels:  imagenet, resnext
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (+124%)
Mutual labels:  imagenet, neural-architecture-search
Segmentationcpp
A c++ trainable semantic segmentation library based on libtorch (pytorch c++). Backbone: ResNet, ResNext. Architecture: FPN, U-Net, PAN, LinkNet, PSPNet, DeepLab-V3, DeepLab-V3+ by now.
Stars: ✭ 49 (-2%)
Mutual labels:  imagenet, resnext
Pretrained Models.pytorch
Pretrained ConvNets for pytorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResnetV2, Xception, DPN, etc.
Stars: ✭ 8,318 (+16536%)
Mutual labels:  imagenet, resnext
Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+3388%)
Mutual labels:  imagenet, pretrained-models
Neural Backed Decision Trees
Making decision trees competitive with neural networks on CIFAR10, CIFAR100, TinyImagenet200, Imagenet
Stars: ✭ 411 (+722%)
Mutual labels:  imagenet, pretrained-models
Imgclsmob
Sandbox for training deep learning networks
Stars: ✭ 2,405 (+4710%)
Mutual labels:  imagenet, pretrained-models
Atomnas
Code for ICLR 2020 paper 'AtomNAS: Fine-Grained End-to-End Neural Architecture Search'
Stars: ✭ 197 (+294%)
Mutual labels:  imagenet, neural-architecture-search
ghostnet.pytorch
73.6% GhostNet 1.0x pre-trained model on ImageNet
Stars: ✭ 90 (+80%)
Mutual labels:  imagenet, pretrained-models
TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (+32%)
Mutual labels:  imagenet, neural-architecture-search
Randwirenn
Implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675 (+1250%)
Mutual labels:  imagenet, neural-architecture-search
Caffe Model
Caffe models (including classification, detection and segmentation) and deploy files for famous networks
Stars: ✭ 1,258 (+2416%)
Mutual labels:  imagenet, resnext
Mmclassification
OpenMMLab Image Classification Toolbox and Benchmark
Stars: ✭ 532 (+964%)
Mutual labels:  imagenet, resnext
Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (+104%)
Mutual labels:  imagenet, neural-architecture-search
Cnn Models
ImageNet pre-trained models with batch normalization for the Caffe framework
Stars: ✭ 355 (+610%)
Mutual labels:  imagenet, pretrained-models
Mobilenetv2.pytorch
72.8% MobileNetV2 1.0 model on ImageNet and a spectrum of pre-trained MobileNetV2 models
Stars: ✭ 369 (+638%)
Mutual labels:  imagenet, pretrained-models
Efficientnet
Implementation of EfficientNet model. Keras and TensorFlow Keras.
Stars: ✭ 1,920 (+3740%)
Mutual labels:  imagenet, pretrained-models
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+758%)
Mutual labels:  imagenet, pretrained-models

RegNet Implementation with TorchVision Style

PyTorch implementation of Designing Network Design Spaces by Ilija Radosavovic, Raj Prateek Kosaraju, Ross Girshick, Kaiming He, and Piotr Dollár.
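The paper parameterizes each network by an initial width w_0, a slope w_a, and a quantization multiplier w_m: block widths start from the linear ramp u_j = w_0 + w_a·j and are snapped to the nearest power of w_m times w_0, then rounded to a multiple of 8. A minimal stdlib-only sketch of that rule follows; the example parameter values are the RegNetX-200MF settings as published in the authors' pycls code, used here purely for illustration:

```python
import math

def generate_widths(w_0, w_a, w_m, depth, q=8):
    """Quantized linear width schedule from the RegNet paper.

    u_j = w_0 + w_a * j gives a linear ramp; each u_j is snapped to the
    nearest w_0 * w_m**s, then rounded to the nearest multiple of q.
    """
    widths = []
    for j in range(depth):
        u_j = w_0 + w_a * j
        s_j = round(math.log(u_j / w_0) / math.log(w_m))
        w_j = w_0 * (w_m ** s_j)
        widths.append(int(round(w_j / q) * q))
    return widths

# RegNetX-200MF parameters (from pycls; illustrative):
# yields stage widths 24, 56, 152, 368 with depths 1, 1, 4, 7.
print(generate_widths(w_0=24, w_a=36.44, w_m=2.49, depth=13))
```

Consecutive blocks that quantize to the same width form one stage, which is how the per-stage widths and depths of the RegNetX models arise.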

Compared to the official codebase, this repository follows torchvision's ResNeXt style, which should make it easier to interpret and to plug into pre-existing downstream applications.

We train the following models on eight TITAN Xp GPUs with 12 GB VRAM each. During the first five epochs, the learning rate is linearly ramped up from 0.1.
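The warmup described above can be sketched as a simple piecewise schedule. Note the peak learning rate and the step counts below are assumptions for illustration only; this README states only the 0.1 starting value and the five-epoch duration:

```python
def warmup_lr(step, warmup_steps, base_lr, start_lr=0.1):
    """Linearly interpolate the learning rate from start_lr up to
    base_lr over the first `warmup_steps` steps, then hold at base_lr."""
    if step >= warmup_steps:
        return base_lr
    return start_lr + (step / warmup_steps) * (base_lr - start_lr)

# Illustrative only: 6,260 warmup steps (hypothetical 5 epochs at
# 1,252 iterations/epoch) and a peak LR of 0.8 (assumed, not stated here).
for step in (0, 3130, 6260):
    print(step, round(warmup_lr(step, 6260, 0.8), 3))
```

After warmup, the schedule would hand off to whatever decay rule the training recipe uses (e.g. step or cosine decay); that part is outside this sketch.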

Pre-trained Models

| Model        | Params (M) | GFLOPs | Batch size | Top-1 acc (%) (our impl.) | Top-1 acc (%) (official) |
|--------------|-----------:|-------:|-----------:|--------------------------:|-------------------------:|
| RegNetX-200M |      2.685 |  0.199 |       1024 |                    68.210 |                     68.9 |
| RegNetX-400M |      5.158 |  0.398 |       1024 |                    72.278 |                     72.7 |
| RegNetX-600M |      6.196 |  0.601 |       1024 |                    73.862 |                     74.1 |
| RegNetX-800M |      7.260 |  0.800 |       1024 |                    74.940 |                     75.2 |
| RegNetX-1.6G |      9.190 |  1.603 |       1024 |                    76.706 |                     77.0 |
| RegNetX-3.2G |     15.296 |  3.177 |        512 |                    78.188 |                     78.3 |
| RegNetX-4.0G |     22.118 |  3.965 |        512 |                    78.690 |                     78.6 |
| RegNetX-6.4G |     26.209 |  6.460 |        512 |                    79.152 |                     79.2 |
| RegNetX-8.0G |     39.573 |  7.995 |        512 |                    79.380 |                     79.3 |
| RegNetX-12G  |     46.106 | 12.087 |        256 |                    79.998 |                     79.7 |
| RegNetX-16G  |     54.279 | 15.941 |        256 |                    80.118 |                     80.0 |
| RegNetX-32G  |    107.812 | 31.736 |        256 |                    80.516 |                     80.5 |

Citation

@InProceedings{Radosavovic_2020_CVPR,
author = {Radosavovic, Ilija and Kosaraju, Raj Prateek and Girshick, Ross and He, Kaiming and Doll{\'a}r, Piotr},
title = {Designing Network Design Spaces},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}