
Friedrich1006 / ESNAC

Licence: other
Learnable Embedding Space for Efficient Neural Architecture Compression

Programming Languages

python

Projects that are alternatives of or similar to ESNAC

Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+39522.22%)
Mutual labels:  bayesian-optimization, model-compression, neural-architecture-search
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+2407.41%)
Mutual labels:  model-compression, neural-architecture-search
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+574.07%)
Mutual labels:  bayesian-optimization, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+2459.26%)
Mutual labels:  model-compression, neural-architecture-search
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+1588.89%)
Mutual labels:  bayesian-optimization, neural-architecture-search
syne-tune
Large scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (+288.89%)
Mutual labels:  bayesian-optimization, neural-architecture-search
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (+25.93%)
Mutual labels:  bayesian-optimization, neural-architecture-search
Nasbot
Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (+344.44%)
Mutual labels:  bayesian-optimization, neural-architecture-search
Auto-Compression
Automatic DNN compression tool with various model compression and neural architecture search techniques
Stars: ✭ 19 (-29.63%)
Mutual labels:  model-compression, neural-architecture-search
CM-NAS
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification (ICCV2021)
Stars: ✭ 39 (+44.44%)
Mutual labels:  neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-3.7%)
Mutual labels:  neural-architecture-search
receptive field analysis toolbox
A toolbox for receptive field analysis and visualizing neural network architectures
Stars: ✭ 84 (+211.11%)
Mutual labels:  neural-architecture-search
AutoPrognosis
Codebase for "AutoPrognosis: Automated Clinical Prognostic Modeling via Bayesian Optimization", ICML 2018.
Stars: ✭ 47 (+74.07%)
Mutual labels:  bayesian-optimization
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-18.52%)
Mutual labels:  bayesian-optimization
torch-model-compression
An automated model-structure analysis and modification toolset for PyTorch models, including a library of model compression algorithms that analyze model structure automatically.
Stars: ✭ 126 (+366.67%)
Mutual labels:  model-compression
pcdarts-tf2
PC-DARTS (PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search, published in ICLR 2020) implemented in Tensorflow 2.0+. This is an unofficial implementation.
Stars: ✭ 25 (-7.41%)
Mutual labels:  neural-architecture-search
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (+33.33%)
Mutual labels:  model-compression
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+792.59%)
Mutual labels:  bayesian-optimization
deep-learning-roadmap
my own deep learning mastery roadmap
Stars: ✭ 40 (+48.15%)
Mutual labels:  neural-architecture-search
FastPose
PyTorch real-time multi-person keypoint estimation
Stars: ✭ 36 (+33.33%)
Mutual labels:  model-compression

ESNAC: Embedding Space for Neural Architecture Compression

This is the PyTorch implementation of our paper:

Learnable Embedding Space for Efficient Neural Architecture Compression.
Shengcao Cao*, Xiaofang Wang*, and Kris M. Kitani. ICLR 2019. [OpenReview] [arXiv].

Requirements

We recommend using this repository with Anaconda Python 3.7 and the following libraries:
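Whatever the exact library versions, a minimal environment setup under these assumptions looks like the commands below (the environment name is arbitrary; PyTorch itself is required, since this is a PyTorch implementation):

    conda create -n esnac python=3.7
    conda activate esnac
    conda install pytorch torchvision -c pytorch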

Usage

  • Before running compression.py, you need to prepare the pretrained teacher models and put them in the folder ./models/pretrained. You can either train them yourself with train_model_teacher() in training.py, or download them at:

    We would like to point out that these provided pretrained teacher models are not trained on the full training set of CIFAR-10 or CIFAR-100. For both CIFAR-10 and CIFAR-100, we sample 5K images from the full training set as the validation set. The provided pretrained teacher models are trained on the remaining training images and are only used during the search process. The teacher accuracy reported in our paper refers to the accuracy of teacher models trained on the full training set of CIFAR-10 or CIFAR-100.
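    For example, a minimal way to put downloaded checkpoints in place is shown below (the checkpoint filename is hypothetical; use whatever names the download or train_model_teacher() produces):

    mkdir -p ./models/pretrained
    cp ~/Downloads/resnet34_cifar100.pth ./models/pretrained/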

  • Then run the main program:

    python compression.py [-h] [--network NETWORK] [--dataset DATASET]
                          [--suffix SUFFIX] [--device DEVICE]
    

    For example, run

    python compression.py --network resnet34 --dataset cifar100 --suffix 0 --device cuda
    

    and you will see how the ResNet-34 architecture is compressed on the CIFAR-100 dataset using your GPU. The results will be saved at ./save/resnet34_cifar100_0 and the TensorBoard log will be saved at ./runs/resnet34_cifar100_0.

    Other hyper-parameters can be adjusted in options.py.

  • The whole process includes two stages: searching for desired compressed architectures, and fully training them. compression.py performs both. Optionally, you can use TensorBoard to monitor the process through the log files, as shown below.
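    TensorBoard is launched with its standard CLI, pointed at the log directory mentioned above (adjust the path if your logs are stored elsewhere):

    tensorboard --logdir ./runs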

  • After the compression finishes, you can use the script stat.py to collect statistics on the compression results; a possible invocation is sketched below.
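    Assuming stat.py reads the results saved under ./save (check the script for its actual arguments), the call would simply be:

    python stat.py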

Random Seed and Reproducibility

To ensure reproducibility, we provide the compression results on CIFAR-100 with random seed 127. This seed value was picked at random. You can try other seed values, or comment out the call to seed_everything() in compression.py (a typical implementation of such a helper is sketched after the table), to obtain different results. Here are the compression results on CIFAR-100 when fixing the seed value to 127:

Teacher      Accuracy   #Params   Ratio    Times   f(x)
VGG-19       71.64%     3.07M     0.8470   6.54×   0.9492
ResNet-18    71.91%     1.26M     0.8876   8.90×   0.9024
ResNet-34    75.47%     2.85M     0.8664   7.48×   0.9417
ShuffleNet   68.17%     0.18M     0.8298   5.88×   0.9305
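The Ratio and Times columns are consistent with Times ≈ 1 / (1 - Ratio): for VGG-19, 1 / (1 - 0.8470) ≈ 6.54×. Ratio can therefore be read as the fraction of parameters removed, and Times as the resulting reduction factor in model size.

seed_everything() is defined in this repository; the sketch below shows what such a helper typically looks like (an assumption for illustration, not necessarily the exact code here), seeding every random number generator involved:

    import random

    import numpy as np
    import torch

    def seed_everything(seed=127):
        # Seed Python, NumPy, and PyTorch (CPU and all GPUs).
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # Deterministic cuDNN kernels trade some speed for reproducibility.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False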

Citation

If you find our work useful in your research, please consider citing our paper Learnable Embedding Space for Efficient Neural Architecture Compression:

@inproceedings{
  cao2018learnable,
  title={Learnable Embedding Space for Efficient Neural Architecture Compression},
  author={Shengcao Cao and Xiaofang Wang and Kris M. Kitani},
  booktitle={International Conference on Learning Representations},
  year={2019},
  url={https://openreview.net/forum?id=S1xLN3C9YX},
}