antoyang / NAS-Benchmark

License: GPL-3.0
"NAS evaluation is frustratingly hard", ICLR 2020

NAS-Benchmark

This repository includes the code used to evaluate NAS methods on 5 different datasets, as well as the code used to augment architectures with different protocols, as described in our ICLR 2020 paper (https://arxiv.org/abs/1912.12522). Example scripts are provided in each folder.

ICLR 2020 poster presentation video

The video of our ICLR 2020 poster presentation is available at https://iclr.cc/virtual_2020/poster_HygrdpVKvr.html.

Plots

All code used to generate the plots of the paper can be found in the "Plots" folder.

Randomly Sampled Architectures

You can find all sampled architectures and the corresponding training logs in Plots/data/modified_search_space.
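
As a quick way to inspect this data, the sketch below simply walks that folder and lists the files it contains. The exact file names and log format inside Plots/data/modified_search_space are assumptions here, so adapt the pattern to what you actually find:

```python
from pathlib import Path

# Hypothetical sketch: enumerate the sampled architectures and training logs
# shipped under Plots/data/modified_search_space. The folder layout and file
# formats are assumptions; adjust the path or filtering as needed.
root = Path("Plots/data/modified_search_space")

for entry in sorted(p for p in root.rglob("*") if p.is_file()):
    print(entry.relative_to(root), f"({entry.stat().st_size} bytes)")
```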

Data

In the data folder, you will find the data splits for Sport-8, MIT-67 and Flowers-102 in .csv files.

You can download these datasets from the following websites:

Sport-8: http://vision.stanford.edu/lijiali/event_dataset/

MIT-67: http://web.mit.edu/torralba/www/indoor.html

Flowers-102: http://www.robots.ox.ac.uk/~vgg/data/flowers/102/

The data path must be set as follows: dataset/train/classes/images for the training set and dataset/test/classes/images for the test set.
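
With the images arranged this way, each split can be loaded with a standard torchvision ImageFolder, since every class sits in its own sub-directory. The snippet below is only a minimal sketch (not the repository's own loading code); the "dataset" root, image size and batch size are placeholders:

```python
import torchvision.transforms as transforms
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder

# Minimal sketch (not the repository's own code): load a dataset organized as
# dataset/train/<class>/<image> and dataset/test/<class>/<image>.
transform = transforms.Compose([
    transforms.Resize((224, 224)),  # placeholder size; use whatever the NAS method expects
    transforms.ToTensor(),
])

train_set = ImageFolder("dataset/train", transform=transform)  # "dataset" is a placeholder root
test_set = ImageFolder("dataset/test", transform=transform)

train_loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=4)
test_loader = DataLoader(test_set, batch_size=64, shuffle=False, num_workers=4)

print(f"{len(train_set)} training images, {len(test_set)} test images, "
      f"{len(train_set.classes)} classes")
```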

We used the following repositories:

DARTS

Paper: Liu, Hanxiao, Karen Simonyan, and Yiming Yang. "Darts: Differentiable architecture search." arXiv preprint arXiv:1806.09055 (2018).

Unofficial updated implementation: https://github.com/khanrc/pt.darts

P-DARTS

Paper: Chen, Xin, Lingxi Xie, Jun Wu, and Qi Tian. "Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation." ICCV, 2019.

Official implementation: https://github.com/chenxin061/pdarts

CNAS

Paper: Weng, Yu, et al. "Automatic Convolutional Neural Architecture Search for Image Classification Under Different Scenes." IEEE Access 7 (2019): 38495-38506.

Official implementation: https://github.com/tianbaochou/CNAS

StacNAS

Paper: Li, Guilin, et al. "StacNAS: Towards Stable and Consistent Differentiable Neural Architecture Search." arXiv preprint arXiv:1909.11926 (2019).

Implementation: provided by the authors.

ENAS

Paper: Pham, Hieu, et al. "Efficient neural architecture search via parameter sharing." arXiv preprint arXiv:1802.03268 (2018).

Official TensorFlow implementation: https://github.com/melodyguan/enas

Unofficial PyTorch implementation: https://github.com/MengTianjian/enas-pytorch

MANAS

Paper: Carlucci, Fabio Maria, et al. "MANAS: Multi-Agent Neural Architecture Search." arXiv preprint arXiv:1909.01051 (2019).

Implementation: provided by the authors.

NSGA-NET

Paper: Lu, Zhichao, et al. "NSGA-NET: a multi-objective genetic algorithm for neural architecture search." arXiv preprint arXiv:1810.03522 (2018).

Official implementation: https://github.com/ianwhale/nsga-net

NAO

Paper: Luo, Renqian, et al. "Neural architecture optimization." Advances in neural information processing systems. 2018.

Official PyTorch implementation: https://github.com/renqianluo/NAO_pytorch

For the two following methods, we have not yet performed consistent experiments, so they are not included in the paper. Nonetheless, we provide runnable code for them, which could yield insights similar to those reported in the paper for the other methods.

PC-DARTS

Paper: Xu, Yuhui, et al. "PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search." arXiv preprint arXiv:1907.05737 (2019).

Official implementation: https://github.com/yuhuixu1993/PC-DARTS

PRDARTS

Paper: Laube, Kevin Alexander, and Andreas Zell. "Prune and Replace NAS." arXiv preprint arXiv:1906.07528 (2019).

Official implementation: https://github.com/cogsys-tuebingen/prdarts

AutoAugment

Paper: Cubuk, Ekin D., et al. "Autoaugment: Learning augmentation policies from data." arXiv preprint arXiv:1805.09501 (2018).

Unofficial PyTorch implementation: https://github.com/DeepVoltaire/AutoAugment

Citation

If you found this work useful, consider citing us:

@inproceedings{
Yang2020NAS,
title={NAS evaluation is frustratingly hard},
author={Antoine Yang and Pedro M. Esperança and Fabio M. Carlucci},
booktitle={International Conference on Learning Representations},
year={2020},
url={https://openreview.net/forum?id=HygrdpVKvr}
}