VITA-Group / TENAS

License: MIT
[ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, Zhangyang Wang

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to TENAS

Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+5396.83%)
Mutual labels:  neural-architecture-search
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+974.6%)
Mutual labels:  neural-architecture-search
Efficientnas
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search https://arxiv.org/abs/1807.06906
Stars: ✭ 44 (-30.16%)
Mutual labels:  neural-architecture-search
Neural Architecture Search
Basic implementation of Neural Architecture Search with Reinforcement Learning: https://arxiv.org/abs/1611.01578
Stars: ✭ 352 (+458.73%)
Mutual labels:  neural-architecture-search
Awesome Federated Learning
Federated Learning Library: https://fedml.ai
Stars: ✭ 624 (+890.48%)
Mutual labels:  neural-architecture-search
Slimmable networks
Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019
Stars: ✭ 708 (+1023.81%)
Mutual labels:  neural-architecture-search
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+390.48%)
Mutual labels:  neural-architecture-search
Awesome Architecture Search
A curated list of awesome architecture search resources
Stars: ✭ 1,078 (+1611.11%)
Mutual labels:  neural-architecture-search
Randwirenn
Implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675 (+971.43%)
Mutual labels:  neural-architecture-search
Neural Architecture Search With Rl
Minimal Tensorflow implementation of the paper "Neural Architecture Search With Reinforcement Learning" presented at ICLR 2017
Stars: ✭ 37 (-41.27%)
Mutual labels:  neural-architecture-search
Autogan
[ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang and Zhangyang Wang
Stars: ✭ 388 (+515.87%)
Mutual labels:  neural-architecture-search
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+623.81%)
Mutual labels:  neural-architecture-search
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (+1368.25%)
Mutual labels:  neural-architecture-search
Adanet
Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+5201.59%)
Mutual labels:  neural-architecture-search
Autokeras
AutoML library for deep learning
Stars: ✭ 8,269 (+13025.4%)
Mutual labels:  neural-architecture-search
Real Time Network
real-time network architecture for mobile devices and semantic segmentation
Stars: ✭ 308 (+388.89%)
Mutual labels:  neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+996.83%)
Mutual labels:  neural-architecture-search
Mtlnas
[CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (-7.94%)
Mutual labels:  neural-architecture-search
Nsganetv2
[ECCV2020] NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search
Stars: ✭ 52 (-17.46%)
Mutual labels:  neural-architecture-search
Morph Net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+1387.3%)
Mutual labels:  neural-architecture-search

Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective [PDF]


Wuyang Chen, Xinyu Gong, Zhangyang Wang

In ICLR 2021.

Overview

We present TE-NAS, the first published training-free neural architecture search method, which combines extremely fast search (no gradient descent at all!) with high-quality performance.

Highlights:

  • Training-free and label-free NAS: we achieve extremely fast neural architecture search without a single step of gradient descent.
  • Bridging the theory-application gap: we identify two training-free indicators that rank the quality of deep networks: the condition number of their NTKs, and the number of linear regions in their input space (see the sketch after this list).
  • SOTA: TE-NAS achieves extremely fast search (on one 1080Ti: 20 minutes on the NAS-Bench-201 space, four hours on the DARTS space for ImageNet) while maintaining competitive accuracy.
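
To make the two indicators concrete, below is a minimal PyTorch sketch, not the repository's exact code: it estimates the NTK condition number from per-sample gradients, and approximates the number of linear regions by counting distinct ReLU activation patterns. Here net and inputs stand in for any candidate architecture and a small data batch.

# Minimal illustrative sketch of the two training-free indicators
# (not the repository's exact implementation).
import torch

def ntk_condition_number(net, inputs):
    # Empirical NTK built from per-sample gradient vectors; returns
    # kappa = lambda_max / lambda_min (smaller correlates with better trainability).
    grads = []
    for x in inputs:
        out = net(x.unsqueeze(0)).sum()            # scalar output for one sample
        g = torch.autograd.grad(out, [p for p in net.parameters() if p.requires_grad])
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    jac = torch.stack(grads)                       # (batch, num_params) Jacobian
    ntk = jac @ jac.t()                            # empirical NTK Gram matrix
    eigs = torch.linalg.eigvalsh(ntk)              # eigenvalues in ascending order
    return (eigs[-1] / eigs[0]).item()

def num_activation_patterns(net, inputs):
    # Proxy for the number of linear regions: count distinct ReLU sign
    # patterns over the batch (larger correlates with higher expressivity).
    signs = []
    hooks = [m.register_forward_hook(
                 lambda _m, _i, out: signs.append((out > 0).flatten(1)))
             for m in net.modules() if isinstance(m, torch.nn.ReLU)]
    with torch.no_grad():
        net(inputs)
    for h in hooks:
        h.remove()
    patterns = torch.cat(signs, dim=1)             # (batch, total ReLU units)
    return len({tuple(p.tolist()) for p in patterns})

In the paper, these quantities are averaged over several random initializations, and only their relative rankings across candidate architectures are used during the pruning-based search; the sketch above only shows the core computation.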

Prerequisites

  • Ubuntu 16.04
  • Python 3.6.9
  • CUDA 10.1 (lower versions may work but were not tested)
  • NVIDIA GPU + CuDNN v7.3

This repository has been tested on GTX 1080Ti. Configurations may need to be changed on different platforms.

Installation

  • Clone this repo:
git clone https://github.com/chenwydj/TENAS.git
cd TENAS
  • Install dependencies:
pip install -r requirements.txt

Usage

0. Prepare the dataset

  • Please follow the guideline here to prepare the CIFAR-10/100 and ImageNet datasets, as well as the NAS-Bench-201 database.
  • Remember to properly set TORCH_HOME and data_paths in prune_launch.py, as illustrated below.
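
Illustrative placeholder values only (the exact paths depend on your machine, and the dictionary layout with keys matching the --dataset flags below is an assumption, not the file's verbatim contents):

# In prune_launch.py -- substitute your own local paths.
TORCH_HOME = "/path/to/torch_home"   # directory holding the NAS-Bench-201 .pth file
data_paths = {
    "cifar10":        "/path/to/cifar10",
    "cifar100":       "/path/to/cifar100",
    "ImageNet16-120": "/path/to/ImageNet16",
    "imagenet-1k":    "/path/to/imagenet",
}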

1. Search

NAS-Bench-201 Space

python prune_launch.py --space nas-bench-201 --dataset cifar10 --gpu 0
python prune_launch.py --space nas-bench-201 --dataset cifar100 --gpu 0
python prune_launch.py --space nas-bench-201 --dataset ImageNet16-120 --gpu 0

DARTS Space (NASNET)

python prune_launch.py --space darts --dataset cifar10 --gpu 0
python prune_launch.py --space darts --dataset imagenet-1k --gpu 0

2. Evaluation

  • For architectures searched on nas-bench-201, the accuracies are reported in the console output immediately at the end of the search.
  • For architectures searched on darts, please use DARTS_evaluation to train the searched architecture from scratch and evaluate it.

Citation

@inproceedings{chen2020tenas,
  title={Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective},
  author={Chen, Wuyang and Gong, Xinyu and Wang, Zhangyang},
  booktitle={International Conference on Learning Representations},
  year={2021}
}
