changlin31 / DS-Net

Licence: other
(CVPR 2021, Oral) Dynamic Slimmable Network

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to DS-Net

Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (-18.63%)
Mutual labels:  pruning, model-compression
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-75.49%)
Mutual labels:  pruning, model-compression
torch-model-compression
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a library of model-compression algorithms with automatic model-structure analysis.
Stars: ✭ 126 (-38.24%)
Mutual labels:  pruning, model-compression
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (-15.2%)
Mutual labels:  pruning, model-compression
Torch Pruning
A pytorch pruning toolkit for structured neural network pruning and layer dependency maintaining.
Stars: ✭ 193 (-5.39%)
Mutual labels:  pruning, model-compression
Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+754.9%)
Mutual labels:  model-compression, efficient-inference
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-78.43%)
Mutual labels:  pruning, model-compression
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-79.9%)
Mutual labels:  pruning, model-compression
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+231.86%)
Mutual labels:  pruning, model-compression
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (+65.69%)
Mutual labels:  pruning, model-compression
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT) with high-bit (>2b) methods (DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b)/ternary/binary methods (TWN/BNN/XNOR-Net), plus post-training quantization (PTQ, 8-bit via TensorRT); (2) pruning: normal, regular, and group-convolution channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), dynamic shapes.
Stars: ✭ 1,232 (+503.92%)
Mutual labels:  pruning, model-compression
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+398.53%)
Mutual labels:  pruning, model-compression
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (+42.65%)
Mutual labels:  pruning, model-compression
Model Optimization
A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+386.27%)
Mutual labels:  pruning, model-compression
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+9.31%)
Mutual labels:  pruning, model-compression
Adventures In Tensorflow Lite
This repository contains notebooks that show the usage of TensorFlow Lite for quantizing deep neural networks.
Stars: ✭ 79 (-61.27%)
Mutual labels:  pruning
Nncf
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (+6.86%)
Mutual labels:  pruning
Model Compression And Acceleration Progress
Repository to track the progress in model compression and acceleration
Stars: ✭ 63 (-69.12%)
Mutual labels:  pruning
Grasp
Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" https://openreview.net/pdf?id=SkgsACVKPH
Stars: ✭ 58 (-71.57%)
Mutual labels:  pruning
Ntagger
reference pytorch code for named entity tagging
Stars: ✭ 58 (-71.57%)
Mutual labels:  pruning

Dynamic Slimmable Network (DS-Net)

This repository contains the PyTorch implementation of our paper Dynamic Slimmable Network (CVPR 2021 Oral).

[Figure] Architecture of DS-Net. The width of each supernet stage is adjusted adaptively by the slimming ratio ρ predicted by the gate.

[Figure] Accuracy vs. complexity on ImageNet.

Pretrained Supernet

  • Supernet Checkpoint

  • Here is a summary of the sub-network performance of the pretrained supernet:

    Subnetwork   MAdds   Top-1 (%)   Top-5 (%)
    0            133M    70.1        89.4
    1            153M    70.4        89.6
    2            175M    70.8        89.9
    3            200M    71.2        90.2
    4            226M    71.6        90.3
    5            255M    72.0        90.6
    6            286M    72.4        90.9
    7            319M    72.7        91.0
    8            355M    73.0        91.2
    9            393M    73.3        91.4
    10           433M    73.6        91.5
    11           475M    73.9        91.7
    12           519M    74.1        91.8
    13           565M    74.6        92.0
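
The table doubles as a lookup for budget-constrained deployment: pick the largest sub-network whose cost fits your compute budget. A small illustrative helper (hypothetical, using only the MAdds column above):

# Illustrative helper; numbers copied from the summary table above.
MADDS_M = [133, 153, 175, 200, 226, 255, 286, 319, 355, 393, 433, 475, 519, 565]

def pick_subnetwork(budget_madds_m):
    """Index of the largest pretrained sub-network within the MAdds budget."""
    fitting = [i for i, m in enumerate(MADDS_M) if m <= budget_madds_m]
    if not fitting:
        raise ValueError("no sub-network fits a budget of %sM MAdds" % budget_madds_m)
    return fitting[-1]

print(pick_subnetwork(300))  # -> 6 (286M MAdds, 72.4% top-1 in the table above)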

Usage

1. Requirements

2. Stage I: Supernet Training

For example, to train the dynamic slimmable MobileNet supernet with 8 GPUs (training takes about 2 days):

python -m torch.distributed.launch --nproc_per_node=8 train.py /PATH/TO/ImageNet -c ./configs/mobilenetv1_bn_uniform.yml
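
The exact recipe is driven by the YAML config. As a rough, generic sketch of how slimmable supernets are commonly trained, the "sandwich rule" trains the smallest width, the largest width, and a few random widths in each step, accumulating gradients before the optimizer update; the sketch below assumes a model that accepts a width ratio rho and is an illustration, not the repository's exact procedure.

# Generic sandwich-rule sketch; not the repository's exact training loop.
import random

def supernet_step(model, images, labels, criterion, optimizer,
                  min_rho=0.35, max_rho=1.0, n_random=2):
    """One sandwich-rule training step over several sampled widths."""
    optimizer.zero_grad()
    rhos = [min_rho, max_rho] + [random.uniform(min_rho, max_rho)
                                 for _ in range(n_random)]
    total = 0.0
    for rho in rhos:
        logits = model(images, rho=rho)  # assumed width-ratio interface
        loss = criterion(logits, labels)
        loss.backward()                  # accumulate gradients across widths
        total += loss.item()
    optimizer.step()
    return total / len(rhos)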

3. Stage II: Gate Training

  • Modify the resume: entry in configs/mobilenetv1_bn_uniform_reset_bn.yml to point to your supernet checkpoint, then recalibrate BN before gate training (a generic sketch of what recalibration does follows this list):

    python -m torch.distributed.launch --nproc_per_node=8 train.py /PATH/TO/ImageNet -c ./configs/mobilenetv1_bn_uniform_reset_bn.yml
    
  • Modify the resume: entry in configs/mobilenetv1_bn_uniform_gate.yml to point to your supernet checkpoint after BN recalibration (or to our pretrained Supernet Checkpoint), then start gate training:

    python -m torch.distributed.launch --nproc_per_node=8 train.py /PATH/TO/ImageNet -c ./configs/mobilenetv1_bn_uniform_gate.yml
    
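As noted in the first step above, BN recalibration re-estimates the BatchNorm running statistics so they match the width configuration being evaluated. A generic sketch of the idea (again assuming a model that accepts a width ratio rho; this is not the repository's implementation):

# Generic sketch; not the repository's BN recalibration code.
import torch

@torch.no_grad()
def recalibrate_bn(model, loader, rho=1.0, num_batches=100):
    """Reset BN buffers, then re-estimate them by streaming training batches."""
    for m in model.modules():
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()
            m.momentum = None        # None = cumulative moving average in PyTorch
    model.train()                    # BN updates running stats only in train mode
    for i, (images, _) in enumerate(loader):
        if i >= num_batches:
            break
        model(images, rho=rho)       # forward pass only; no gradient updates
    model.eval()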

Citation

If you find our code useful for your research, please cite:

@inproceedings{li2021dynamic,
  author = {Changlin Li and
            Guangrun Wang and
            Bing Wang and
            Xiaodan Liang and
            Zhihui Li and
            Xiaojun Chang},
  title = {Dynamic Slimmable Network},
  booktitle = {CVPR},
  year = {2021}
}