
mit-han-lab / Amc

License: MIT
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices

Programming Languages

python

Projects that are alternatives of or similar to Amc

Amc Models
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 154 (-48.32%)
Mutual labels:  automl, model-compression
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+131.88%)
Mutual labels:  automl, model-compression
Pocketflow
An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications.
Stars: ✭ 2,672 (+796.64%)
Mutual labels:  automl, model-compression
Nni
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Stars: ✭ 10,698 (+3489.93%)
Mutual labels:  automl, model-compression
allie
🤖 A machine learning framework for audio, text, image, video, or .CSV files (50+ featurizers and 15+ model trainers).
Stars: ✭ 93 (-68.79%)
Mutual labels:  automl, model-compression
MLSample.SimpleTransactionTagging
A simple example of tagging bank transactions with ML.NET.
Stars: ✭ 13 (-95.64%)
Mutual labels:  automl
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-88.59%)
Mutual labels:  automl
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-90.27%)
Mutual labels:  automl
Auto-Surprise
An AutoRecSys library for Surprise. Automate algorithm selection and hyperparameter tuning 🚀
Stars: ✭ 19 (-93.62%)
Mutual labels:  automl
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+1215.44%)
Mutual labels:  automl
My Data Competition Experience
A summary of my experience from repeatedly finishing in the Top 5 of machine learning and big-data competitions; packed with practical tips.
Stars: ✭ 271 (-9.06%)
Mutual labels:  automl
FEDOT
Automated modeling and machine learning framework FEDOT
Stars: ✭ 312 (+4.7%)
Mutual labels:  automl
autodo
Official PyTorch code for CVPR 2021 paper "AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implicit Differentiation"
Stars: ✭ 19 (-93.62%)
Mutual labels:  automl
Meta-SAC
Auto-tune the Entropy Temperature of Soft Actor-Critic via Metagradient - 7th ICML AutoML workshop 2020
Stars: ✭ 19 (-93.62%)
Mutual labels:  automl
A- Guide -to Data Sciecne from mathematics
A blueprint for data science, from the mathematics to the algorithms. Still a work in progress.
Stars: ✭ 25 (-91.61%)
Mutual labels:  model-compression
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+973.15%)
Mutual labels:  automl
DLCV2018SPRING
Deep Learning for Computer Vision (CommE 5052) in NTU
Stars: ✭ 38 (-87.25%)
Mutual labels:  model-compression
sparsify
Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (-53.69%)
Mutual labels:  automl
Autodeeplab
AutoDeeplab / auto-deeplab / AutoML for semantic segmentation, implemented in PyTorch
Stars: ✭ 269 (-9.73%)
Mutual labels:  automl
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-83.22%)
Mutual labels:  model-compression

AutoML for Model Compression (AMC)

This repo contains the PyTorch implementation of the ECCV 2018 paper AMC: AutoML for Model Compression and Acceleration on Mobile Devices.

[Figure: AMC overview]

Reference

If you find the repo useful, please cite our paper:

@inproceedings{he2018amc,
  title={AMC: AutoML for Model Compression and Acceleration on Mobile Devices},
  author={He, Yihui and Lin, Ji and Liu, Zhijian and Wang, Hanrui and Li, Li-Jia and Han, Song},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2018}
}

Other papers related to automated model design:

  • HAQ: Hardware-Aware Automated Quantization with Mixed Precision (CVPR 2019)

  • ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware (ICLR 2019)

Training AMC

The current code base supports automated pruning of MobileNet on ImageNet. Pruning MobileNet consists of 3 steps: 1. strategy search; 2. exporting the pruned weights; 3. fine-tuning from the pruned weights.

To conduct the full pruning procedure, follow the instructions below (results might vary slightly from the paper due to a different random seed):

  1. Strategy Search

    To search for a pruning strategy on the ImageNet MobileNet model, first download the pretrained ImageNet MobileNet checkpoint by running:

    bash ./checkpoints/download.sh
    

    It will also download our 50% FLOPs compressed model. Then run the following script to search under the 50% FLOPs constraint:

    bash ./scripts/search_mobilenet_0.5flops.sh
    

    Results may differ slightly due to a different random seed. The strategy we found and reported in the paper is (one way to read this vector is sketched after this list):

    [3, 24, 48, 96, 80, 192, 200, 328, 352, 368, 360, 328, 400, 736, 752]
    
  2. Export the Pruned Weights

    After searching, we need to export the pruned weights by running:

    bash ./scripts/export_mobilenet_0.5flops.sh
    

    We also need to modify the MobileNet model definition to support the new pruned structure (already done here in models/mobilenet.py; a minimal sketch of such a width-configurable definition follows this list).

  3. Fine-tune from Pruned Weights

    After exporting, we need to fine-tune from the pruned weights. For example, we can fine-tune with a cosine learning-rate schedule for 150 epochs (sketched after this list) by running:

    bash ./scripts/finetune_mobilenet_0.5flops.sh
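
For intuition, the 15-element strategy vector shown in step 1 can be read as the 3 input (RGB) channels followed by the pruned output widths of MobileNet-V1's 14 convolutional stages. The sketch below compares it against the original full widths; this per-layer reading is our interpretation of the vector, not something the scripts document:

# Hypothetical reading: entry 0 is the 3-channel RGB input; entries 1..14
# are pruned output widths for MobileNet-V1's 14 convolutional stages.
strategy = [3, 24, 48, 96, 80, 192, 200, 328, 352, 368, 360, 328, 400, 736, 752]

# Original MobileNet-V1 output widths (width multiplier 1.0), stage by stage.
original = [32, 64, 128, 128, 256, 256, 512, 512, 512, 512, 512, 512, 1024, 1024]

for i, (kept, full) in enumerate(zip(strategy[1:], original)):
    print(f"stage {i:2d}: kept {kept:4d} of {full:4d} channels ({kept / full:.0%})")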
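
Step 2 requires a MobileNet definition that accepts irregular per-layer widths; this repo already provides one in models/mobilenet.py. Purely as a sketch of the idea (hypothetical class and helper names, standard MobileNet-V1 strides assumed; not this repo's actual code):

import torch.nn as nn

def conv_bn(inp, oup, stride):
    # Standard 3x3 convolution + BN + ReLU (MobileNet's first layer).
    return nn.Sequential(
        nn.Conv2d(inp, oup, 3, stride, 1, bias=False),
        nn.BatchNorm2d(oup),
        nn.ReLU(inplace=True),
    )

def conv_dw(inp, oup, stride):
    # Depthwise 3x3 convolution followed by a pointwise 1x1 convolution.
    return nn.Sequential(
        nn.Conv2d(inp, inp, 3, stride, 1, groups=inp, bias=False),
        nn.BatchNorm2d(inp),
        nn.ReLU(inplace=True),
        nn.Conv2d(inp, oup, 1, 1, 0, bias=False),
        nn.BatchNorm2d(oup),
        nn.ReLU(inplace=True),
    )

class PrunedMobileNet(nn.Module):
    # `widths` follows the searched strategy: input channels first, then the
    # output width of each of the 14 convolutional stages.
    def __init__(self, widths, num_classes=1000):
        super().__init__()
        strides = [2, 1, 2, 1, 2, 1, 2, 1, 1, 1, 1, 1, 2, 1]  # MobileNet-V1
        layers = [conv_bn(widths[0], widths[1], strides[0])]
        for i in range(1, len(strides)):
            layers.append(conv_dw(widths[i], widths[i + 1], strides[i]))
        self.features = nn.Sequential(*layers, nn.AdaptiveAvgPool2d(1))
        self.classifier = nn.Linear(widths[-1], num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

A model matching the searched strategy would then be built as PrunedMobileNet(strategy), and the exported checkpoint loaded with load_state_dict; how the state-dict keys line up depends on the export script.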
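
Step 3 fine-tunes with a cosine learning-rate schedule for 150 epochs. A minimal PyTorch sketch of such a loop; the optimizer settings are illustrative assumptions, not the values used by finetune_mobilenet_0.5flops.sh:

import torch
import torch.nn as nn

def finetune(model, train_loader, epochs=150, lr=0.05):
    # SGD with cosine annealing of the learning rate over the whole run;
    # lr, momentum, and weight decay are assumed, not the script's values.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr,
                                momentum=0.9, weight_decay=4e-5)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)
    for _ in range(epochs):
        model.train()
        for images, targets in train_loader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimizer.step()
        scheduler.step()  # advance the cosine schedule once per epoch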
    

AMC Compressed Models

We also provide the models and weights compressed by our AMC method: compressed MobileNet-V1 and MobileNet-V2 in both PyTorch and TensorFlow formats here.

Detailed statistics are as follows:

Models                   Top1 Acc (%)   Top5 Acc (%)
MobileNetV1-width*0.75   68.4           88.2
MobileNetV1-50%FLOPs     70.494         89.306
MobileNetV1-50%Time      70.200         89.430
MobileNetV2-width*0.75   69.8           89.6
MobileNetV2-70%FLOPs     70.854         89.914

Dependencies

The current code base is tested under the following environment:

  1. Python 3.7.3
  2. PyTorch 1.1.0
  3. torchvision 0.2.1
  4. NumPy 1.14.3
  5. SciPy 1.1.0
  6. scikit-learn 0.19.1
  7. tensorboardX
  8. ImageNet dataset

Contact

To contact the authors:

Ji Lin, [email protected]

Song Han, [email protected]
