
naszilla / Naszilla

License: Apache-2.0
Naszilla is a Python library for neural architecture search (NAS)

Programming Languages

python

Projects that are alternatives of or similar to Naszilla

Efficientnas
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search https://arxiv.org/abs/1807.06906
Stars: ✭ 44 (-75.69%)
Mutual labels:  convolutional-neural-networks, automl, neural-architecture-search
Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+1813.26%)
Mutual labels:  automl, neural-architecture-search, convolutional-networks
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-38.12%)
Mutual labels:  automl, neural-architecture-search, cifar10
Mtlnas
[CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (-67.96%)
Mutual labels:  convolutional-neural-networks, automl, neural-architecture-search
Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-43.65%)
Mutual labels:  automl, neural-architecture-search
Nni
An open source AutoML toolkit that automates the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Stars: ✭ 10,698 (+5810.5%)
Mutual labels:  automl, neural-architecture-search
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-35.36%)
Mutual labels:  automl, neural-architecture-search
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-30.39%)
Mutual labels:  automl, neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+555.8%)
Mutual labels:  automl, neural-architecture-search
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (-34.25%)
Mutual labels:  automl, neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+904.97%)
Mutual labels:  automl, neural-architecture-search
Lsuvinit
Reference caffe implementation of LSUV initialization
Stars: ✭ 99 (-45.3%)
Mutual labels:  convolutional-neural-networks, convolutional-networks
Fast Autoaugment
Official Implementation of 'Fast AutoAugment' in PyTorch.
Stars: ✭ 1,297 (+616.57%)
Mutual labels:  convolutional-neural-networks, automl
Livianet
This repository contains the code of LiviaNET, a 3D fully convolutional neural network that was employed in our work: "3D fully convolutional networks for subcortical segmentation in MRI: A large-scale study"
Stars: ✭ 143 (-20.99%)
Mutual labels:  convolutional-neural-networks, convolutional-networks
Tensorflow Cifar 10
CIFAR-10 CNN implementation using the TensorFlow library, achieving 20% error.
Stars: ✭ 85 (-53.04%)
Mutual labels:  convolutional-neural-networks, cifar10
Keras transfer cifar10
Object classification with CIFAR-10 using transfer learning
Stars: ✭ 120 (-33.7%)
Mutual labels:  convolutional-neural-networks, convolutional-networks
Sgas
SGAS: Sequential Greedy Architecture Search (CVPR'2020) https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (-24.31%)
Mutual labels:  automl, neural-architecture-search
Antialiased Cnns
Repository has been moved: https://github.com/adobe/antialiased-cnns
Stars: ✭ 169 (-6.63%)
Mutual labels:  convolutional-neural-networks, convolutional-networks
Tf Adnet Tracking
Deep Object Tracking Implementation in Tensorflow for 'Action-Decision Networks for Visual Tracking with Deep Reinforcement Learning(CVPR 2017)'
Stars: ✭ 162 (-10.5%)
Mutual labels:  convolutional-neural-networks, convolutional-networks
Gtsrb
Convolutional Neural Network for German Traffic Sign Recognition Benchmark
Stars: ✭ 65 (-64.09%)
Mutual labels:  convolutional-neural-networks, convolutional-networks

A repository to compare many popular NAS algorithms seamlessly across three popular benchmarks (NASBench 101, 201, and 301). You can implement your own NAS algorithm and then easily compare it with eleven existing algorithms on all three benchmarks.

This repository contains the official code for the following three papers, including a NeurIPS 2020 spotlight paper:

Paper | README | Blog Post
A Study on Encodings for Neural Architecture Search | encodings.md | Blog Post
BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search | bananas.md | Blog Post
Local Search is State of the Art for Neural Architecture Search Benchmarks | local_search.md | Blog Post

Installation

Clone this repository and install its requirements (which include nasbench, nas-bench-201, and nasbench301). This may take a few minutes.

git clone https://github.com/naszilla/naszilla
cd naszilla
cat requirements.txt | xargs -n 1 -L 1 pip install
pip install -e .

You might need to replace line 32 of src/nasbench301/surrogate_models/surrogate_models.py with a new path to the configspace file:

self.config_loader = utils.ConfigLoader(os.path.expanduser('~/naszilla/src/nasbench301/configspace.json'))

Next, download the NAS benchmark datasets, either with the terminal commands below or from their respective websites (nasbench, nas-bench-201, and nasbench301). The versions recommended for use with naszilla are nasbench_only108.tfrecord, NAS-Bench-201-v1_0-e61699.pth, and nasbench301_models_v0.9.zip. If you use a different version, you might need to edit some of the naszilla code.

# these files are 0.5GB, 2.1GB, and 1.6GB, respectively
wget https://storage.googleapis.com/nasbench/nasbench_only108.tfrecord
wget https://ndownloader.figshare.com/files/25506206?private_link=7d47bf57803227af4909 -O NAS-Bench-201-v1_0-e61699.pth
wget https://ndownloader.figshare.com/files/24693026 -O nasbench301_models_v0.9.zip
unzip nasbench301_models_v0.9.zip

Place the three downloaded benchmark data files in ~/nas_benchmark_datasets (or choose another directory and edit line 15 of naszilla/nas_benchmarks.py accordingly).
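
For example, here is a minimal Python sketch that creates the default directory and moves the two single-file benchmarks into it, assuming the downloads are in the current directory; the unzipped nasbench301 model directory should be moved there in the same way:

from pathlib import Path
import shutil

# default location expected by naszilla/nas_benchmarks.py
data_dir = Path.home() / 'nas_benchmark_datasets'
data_dir.mkdir(exist_ok=True)

# move the two single-file benchmarks from the current directory
for name in ['nasbench_only108.tfrecord', 'NAS-Bench-201-v1_0-e61699.pth']:
    shutil.move(name, str(data_dir / name))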

Now you have successfully installed all of the requirements to run eleven NAS algorithms on three benchmark search spaces!

Test Installation

You can test the installation by running these commands:

cd naszilla
python naszilla/run_experiments.py --search_space nasbench_101 --algo_params all_algos --queries 30 --trials 1
python naszilla/run_experiments.py --search_space nasbench_201 --algo_params all_algos --queries 30 --trials 1
python naszilla/run_experiments.py --search_space nasbench_301 --algo_params all_algos --queries 30 --trials 1

These experiments should finish running within a few minutes.

Run NAS experiments on NASBench-101/201/301 search spaces

cd naszilla
python naszilla/run_experiments.py --search_space nasbench_201 --dataset cifar100 --queries 100 --trials 100

This will test several NAS algorithms against each other on the NASBench-201 search space. Note that NASBench-201 allows you to specify one of three datasets: cifar10, cifar100, or imagenet. To customize your experiment, open naszilla/params.py. Here, you can change the algorithms and their hyperparameters. For details on running specific methods, see these docs.
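
As an illustration, a custom configuration could be added along these lines; the dictionary keys and algorithm names below are assumptions made for the sketch, so check the structures actually used in naszilla/params.py:

# hypothetical sketch of a parameter list for run_experiments.py;
# the key names and algorithm names are assumptions, not confirmed API
def my_algo_params(queries=100):
    return [
        {'algo_name': 'random', 'total_queries': queries},
        {'algo_name': 'local_search', 'total_queries': queries},
        {'algo_name': 'bananas', 'total_queries': queries},
    ]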

Contributions

Contributions are welcome!

Reproducibility

If you have any questions about reproducing an experiment, please open an issue or email [email protected].

Citation

Please cite our papers if you use code from this repo:

@inproceedings{white2020study,
  title={A Study on Encodings for Neural Architecture Search},
  author={White, Colin and Neiswanger, Willie and Nolen, Sam and Savani, Yash},
  booktitle={Advances in Neural Information Processing Systems},
  year={2020}
}

@inproceedings{white2019bananas,
  title={BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search},
  author={White, Colin and Neiswanger, Willie and Savani, Yash},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2021}
}

@article{white2020local,
  title={Local Search is State of the Art for Neural Architecture Search Benchmarks},
  author={White, Colin and Nolen, Sam and Savani, Yash},
  journal={arXiv preprint arXiv:2005.02960},
  year={2020}
}

Contents

This repo contains encodings for neural architecture search, a variety of NAS methods (including BANANAS, a neural predictor Bayesian optimization method, and local search for NAS), and an easy interface for using multiple NAS benchmarks.
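
To make the local-search baseline concrete, here is a generic hill-climbing sketch over an arbitrary search space; get_neighbors and query are hypothetical stand-ins for a benchmark interface, not naszilla's actual API:

def local_search(init_arch, get_neighbors, query, max_queries=100):
    """Greedy local search: move to an improving neighbor until
    no neighbor improves or the query budget is exhausted."""
    best, best_err = init_arch, query(init_arch)
    n_queries = 1
    while n_queries < max_queries:
        improved = False
        for nbr in get_neighbors(best):
            if n_queries >= max_queries:
                break
            err = query(nbr)  # validation error; lower is better
            n_queries += 1
            if err < best_err:
                best, best_err = nbr, err
                improved = True
                break  # restart the scan from the new incumbent
        if not improved:
            break  # local optimum reached
    return best, best_err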

Encodings:

[Figure: encodings results]

BANANAS:

[Figure: BANANAS results (adj_train, adj_test, path_train, path_test)]

Local search:

[Figure: local_search results]
