
naszilla / nas-encodings

License: Apache-2.0
Encodings for neural architecture search

Programming Languages

  • Python
  • Jupyter Notebook
  • Shell

Projects that are alternatives of or similar to nas-encodings

BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (+331.03%)
Mutual labels:  nas, automl, neural-architecture-search
Nni
An open source AutoML toolkit for automate machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+36789.66%)
Mutual labels:  nas, automl, neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+3993.1%)
Mutual labels:  nas, automl, neural-architecture-search
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (+334.48%)
Mutual labels:  nas, automl, neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-10.34%)
Mutual labels:  nas, automl, neural-architecture-search
Awesome Nas Papers
Awesome Neural Architecture Search Papers
Stars: ✭ 213 (+634.48%)
Mutual labels:  nas, automl, neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+6172.41%)
Mutual labels:  nas, automl, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+2282.76%)
Mutual labels:  nas, automl, neural-architecture-search
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+662.07%)
Mutual labels:  nas, automl, neural-architecture-search
Once For All
[ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment
Stars: ✭ 1,127 (+3786.21%)
Mutual labels:  nas, automl
Shape Adaptor
The implementation of "Shape Adaptor: A Learnable Resizing Module" [ECCV 2020].
Stars: ✭ 59 (+103.45%)
Mutual labels:  nas, automl
Fairdarts
Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search
Stars: ✭ 145 (+400%)
Mutual labels:  nas, automl
TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (+127.59%)
Mutual labels:  nas, neural-architecture-search
Autodl
Automated Deep Learning without ANY human intervention. 1st place solution for the AutoDL challenge.
Stars: ✭ 854 (+2844.83%)
Mutual labels:  nas, automl
Dna
Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (+406.9%)
Mutual labels:  nas, neural-architecture-search
AutoSpeech
[InterSpeech 2020] "AutoSpeech: Neural Architecture Search for Speaker Recognition" by Shaojin Ding*, Tianlong Chen*, Xinyu Gong, Weiwei Zha, Zhangyang Wang
Stars: ✭ 195 (+572.41%)
Mutual labels:  automl, neural-architecture-search
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+2234.48%)
Mutual labels:  nas, neural-architecture-search
HyperKeras
An AutoDL tool for Neural Architecture Search and Hyperparameter Optimization on Tensorflow and Keras
Stars: ✭ 29 (+0%)
Mutual labels:  automl, neural-architecture-search
CM-NAS
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification (ICCV2021)
Stars: ✭ 39 (+34.48%)
Mutual labels:  nas, neural-architecture-search
deep-learning-roadmap
my own deep learning mastery roadmap
Stars: ✭ 40 (+37.93%)
Mutual labels:  nas, neural-architecture-search

A Study on Encodings for Neural Architecture Search

Note: this repository has been combined with other naszilla projects into naszilla/naszilla. This repo is deprecated and not maintained. Please use naszilla/naszilla, which has more functionality.

A Study on Encodings for Neural Architecture Search
Colin White, Willie Neiswanger, Sam Nolen, and Yash Savani.
arXiv:2007.04965.

Many algorithms for neural architecture search (NAS) represent each neural architecture in the search space as a directed acyclic graph (DAG), and then search over all DAGs by encoding the adjacency matrix and list of operations as a set of hyperparameters. Recent work has demonstrated that even small changes to the way each architecture is encoded can have a significant effect on the performance of NAS algorithms. We present the first formal study on the effect of architecture encodings for NAS.
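As a concrete illustration, here is a minimal sketch (not code from this repo; the helper name and the toy cell are our own) of the standard adjacency-matrix encoding for a NAS-Bench-101-style cell, in which the upper-triangular adjacency bits are concatenated with a one-hot vector for each interior node's operation:

import numpy as np

# Toy 5-node cell: input node, three operation nodes, output node.
adjacency = np.array([
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
])
ops = ['input', 'conv3x3-bn-relu', 'maxpool3x3', 'conv1x1-bn-relu', 'output']
op_choices = ['conv3x3-bn-relu', 'conv1x1-bn-relu', 'maxpool3x3']

def adjacency_encoding(adjacency, ops, op_choices):
    # Upper-triangular adjacency entries (nodes are topologically ordered, so only i < j matters).
    n = adjacency.shape[0]
    edge_bits = [int(adjacency[i, j]) for i in range(n) for j in range(i + 1, n)]
    # One-hot operation choice for each interior node (input/output nodes are fixed).
    op_bits = []
    for op in ops[1:-1]:
        op_bits.extend(1 if op == choice else 0 for choice in op_choices)
    return np.array(edge_bits + op_bits)

print(adjacency_encoding(adjacency, ops, op_choices))  # length 10 + 3*3 = 19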

Requirements

  • jupyter
  • tensorflow == 1.14.0 (used for all experiments)
  • nasbench (follow the installation instructions here)
  • nas-bench-201 (follow the installation instructions here)
  • pytorch == 1.2.0, torchvision == 0.4.0 (used for experiments on the DARTS search space)
  • pybnn (used only for the DNGO baseline algorithm; installation instructions here)
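As a rough environment sketch for the pip-installable packages above (the pins simply mirror that list; nasbench, nas-bench-201, and pybnn follow their own installation instructions as linked):

pip install jupyter tensorflow==1.14.0 torch==1.2.0 torchvision==0.4.0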

If you run experiments on the DARTS search space, you will need our fork of the DARTS repo:

  • Download our fork of the DARTS repo: https://github.com/naszilla/darts
  • If you don't put the repo in your home directory (i.e., at ~/darts), update line 7 of nas-encodings/darts/arch.py and line 8 of nas-encodings/train_arch_runner.py with the correct path (see the sketch below).
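We have not reproduced those two lines here; purely as a hypothetical illustration, the value being edited is a path string that should point at your clone, e.g.:

darts_folder = '/home/<user>/darts'  # hypothetical variable name; point this at your clone of naszilla/darts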

Download nasbench-101

  • Download the nasbench_only108 tfrecord file (size 499MB) here
  • Place nasbench_only108.tfrecord in the top level folder of this repo
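Once the tfrecord is in place, the download can be sanity-checked by loading it through the nasbench API (a minimal sketch using the package's documented entry point):

from nasbench import api

# Load the NAS-Bench-101 results for architectures trained for 108 epochs.
nasbench = api.NASBench('nasbench_only108.tfrecord')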

Download index-hash

Some of the path-based encoding methods require a hash map from path indices to cell architectures. We have created a pickle file which contains this hash map (size 57MB), located here. Place it in the top level folder of this repo.
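Reading the hash map back is a standard pickle load; in this sketch the filename is a placeholder, so substitute the name of the file you downloaded:

import pickle

with open('index_hash.pkl', 'rb') as f:  # placeholder filename
    index_hash = pickle.load(f)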

Get started quickly: open a Jupyter notebook

  • The easiest way to get started is to run one of our Jupyter notebooks (a launch command is sketched after this list)
  • Open and run meta_neuralnet.ipynb to train a neural predictor with different encodings
  • Open and run notebooks/test_nas.ipynb to test out each algorithm + encoding combination
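For example, the first notebook can be launched from the repo root with:

jupyter notebook meta_neuralnet.ipynb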

Run experiments on nasbench-101

python run_experiments_sequential.py --algo_params evo_encodings

This command will run evolutionary search with six different encodings. To run other experiments, open up params.py.

Run experiments on nasbench-201

To run experiments with NAS-Bench-201, download NAS-Bench-201-v1_0-e61699.pth from here and place it in the top level folder of this repo. Choose from cifar10, cifar100, or imagenet. For example,

python run_experiments_sequential.py --algo_params evo_encodings --search_space nasbench_201_cifar10
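Assuming the CIFAR-100 and ImageNet search-space names follow the same pattern (an assumption on our part; params.py lists the exact strings), the analogous commands would be:

python run_experiments_sequential.py --algo_params evo_encodings --search_space nasbench_201_cifar100
python run_experiments_sequential.py --algo_params evo_encodings --search_space nasbench_201_imagenet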

Citation

Please cite our paper if you use code from this repo:

@inproceedings{white2020study,
  title={A Study on Encodings for Neural Architecture Search},
  author={White, Colin and Neiswanger, Willie and Nolen, Sam and Savani, Yash},
  booktitle={Advances in Neural Information Processing Systems},
  year={2020}
}