
ianwhale / Nsga Net

NSGA-Net, a Neural Architecture Search Algorithm

Programming Languages

python

Projects that are alternatives of or similar to Nsga Net

Hydra
Multi-Task Learning Framework on PyTorch. State-of-the-art methods are implemented to effectively train models on multiple tasks.
Stars: ✭ 87 (-49.12%)
Mutual labels:  neural-architecture-search
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (-30.41%)
Mutual labels:  neural-architecture-search
Sgas
SGAS: Sequential Greedy Architecture Search (CVPR'2020) https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (-19.88%)
Mutual labels:  neural-architecture-search
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+6156.14%)
Mutual labels:  neural-architecture-search
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-34.5%)
Mutual labels:  neural-architecture-search
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-26.32%)
Mutual labels:  neural-architecture-search
Tenas
[ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, Zhangyang Wang
Stars: ✭ 63 (-63.16%)
Mutual labels:  neural-architecture-search
Dna
Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (-14.04%)
Mutual labels:  neural-architecture-search
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-31.58%)
Mutual labels:  neural-architecture-search
Single Path One Shot Nas Mxnet
Single Path One-Shot NAS MXNet implementation with full training and searching pipeline. Support both Block and Channel Selection. Searched models better than the original paper are provided.
Stars: ✭ 136 (-20.47%)
Mutual labels:  neural-architecture-search
Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-40.35%)
Mutual labels:  neural-architecture-search
Deep architect
A general, modular, and programmable architecture search framework
Stars: ✭ 110 (-35.67%)
Mutual labels:  neural-architecture-search
Nas Segm Pytorch
Code for Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells, CVPR '19
Stars: ✭ 126 (-26.32%)
Mutual labels:  neural-architecture-search
Robnets
[CVPR 2020] When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks
Stars: ✭ 95 (-44.44%)
Mutual labels:  neural-architecture-search
Scarlet Nas
Bridging the gap Between Stability and Scalability in Neural Architecture Search
Stars: ✭ 140 (-18.13%)
Mutual labels:  neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+594.15%)
Mutual labels:  neural-architecture-search
Nasbot
Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (-29.82%)
Mutual labels:  neural-architecture-search
Aw nas
aw_nas: A Modularized and Extensible NAS Framework
Stars: ✭ 152 (-11.11%)
Mutual labels:  neural-architecture-search
Deep architect legacy
DeepArchitect: Automatically Designing and Training Deep Architectures
Stars: ✭ 144 (-15.79%)
Mutual labels:  neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+963.74%)
Mutual labels:  neural-architecture-search

NSGA-Net

Code accompanying the paper. All scripts assume they are run from the repository root directory; please update the sys.path entry at the beginning of each script before running it.
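For example, a minimal sketch of the path adjustment each script expects (the directory below is a placeholder for wherever you cloned the repository):

import sys
# Placeholder path: point this at your local NSGA-Net checkout so that the
# models, search and validation packages resolve when running from the root.
sys.path.insert(0, '/path/to/nsga-net')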

NSGA-Net: Neural Architecture Search using Multi-Objective Genetic Algorithm

Zhichao Lu, Ian Whalen, Vishnu Boddeti, Yashesh Dhebar, Kalyanmoy Deb, Erik Goodman and Wolfgang Banzhaf

arXiv:1810.03522

(Figure: NSGA-Net overview)

Requirements

Python >= 3.6.8, PyTorch >= 1.0.1.post2, torchvision >= 0.2.2, pymoo == 0.3.0
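One way to satisfy these requirements (a sketch assuming pip on a CUDA-capable Linux machine; adjust the versions and wheels to your platform):

pip install torch==1.0.1.post2 torchvision==0.2.2 pymoo==0.3.0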

Results on CIFAR-10

(Figure: Pareto front of NSGA-Net architectures on CIFAR-10)

Pretrained models on CIFAR-10

The easiest way to get started is to evaluate our pretrained NSGA-Net models.

Macro search space (NSGA-Net-macro)

(Figure: architecture found in the macro search space)

python validation/test.py --net_type macro --model_path weights.pt
  • Expected result: 3.73% test error rate with 3.37M model parameters, 1240M Multiply-Adds.

Micro search space

(Figure: architecture found in the micro search space)

python validation/test.py --net_type micro --arch NSGANet --init_channels 26 --filter_increment 4 --SE --auxiliary --model_path weights.pt
  • Expected result: 2.43% test error rate with 1.97M model parameters, 417M Multiply-Adds (weights.pt).
python validation/test.py --net_type micro --arch NSGANet --init_channels 34 --filter_increment 4 --auxiliary --model_path weights.pt
  • Expected result: 2.22% test error rate with 2.20M model parameters, 550M Multiply-Adds (weights.pt).
python validation/test.py --net_type micro --arch NSGANet --init_channels 36 --filter_increment 6 --SE --auxiliary --model_path weights.pt
  • Expected result: 2.02% test error rate with 4.05M model parameters, 817M Multiply-Adds (weights.pt).

Pretrained models on CIFAR-100

python validation/test.py --task cifar100 --net_type micro --arch NSGANet --init_channels 36 --filter_increment 6 --SE --auxiliary --model_path weights.pt
  • Expected result: 14.42% test error rate with 4.1M model parameters, 817M Multiply-Adds (weights.pt).

Architecture validation

To validate the results by training from scratch, run

# architecture found from macro search space
python validation/train.py --net_type macro --cutout --batch_size 128 --epochs 350 
# architecture found from micro search space
python validation/train.py --net_type micro --arch NSGANet --layers 20 --init_channels 34 --filter_increment 4  --cutout --auxiliary --batch_size 96 --droprate 0.2 --SE --epochs 600

You may need to adjust the batch_size depending on your GPU memory.

For customized macro search space architectures, change the genome and channels options in train.py.

For customized micro search space architectures, specify your architecture in models/micro_genotypes.py and use the --arch flag to pass its name.
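As a sketch of what such an entry could look like, assuming the DARTS-style Genotype namedtuple used in models/micro_genotypes.py (the cell below is illustrative rather than a searched architecture, and MyCustomNet is a hypothetical name):

from collections import namedtuple

Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')

# Hypothetical cell: each pair is (operation name, index of the input node).
MyCustomNet = Genotype(
    normal=[('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
            ('skip_connect', 0), ('dil_conv_3x3', 1),
            ('sep_conv_5x5', 1), ('max_pool_3x3', 2),
            ('sep_conv_3x3', 2), ('skip_connect', 0)],
    normal_concat=[2, 3, 4, 5],
    reduce=[('max_pool_3x3', 0), ('sep_conv_5x5', 1),
            ('avg_pool_3x3', 0), ('dil_conv_5x5', 2),
            ('sep_conv_3x3', 1), ('skip_connect', 3),
            ('max_pool_3x3', 0), ('sep_conv_3x3', 2)],
    reduce_concat=[2, 3, 4, 5],
)

It can then be trained by passing the name through the --arch flag, e.g. python validation/train.py --net_type micro --arch MyCustomNet (with the remaining flags as in the commands above).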

Architecture search

To run architecture search:

# macro search space
python search/evolution_search.py --search_space macro --init_channels 32 --n_gens 30
# micro search space
python search/evolution_search.py --search_space micro --init_channels 16 --layers 8 --epochs 20 --n_offspring 20 --n_gens 30
(Figures: Pareto front of the search, together with the normal and reduction cells of the corresponding searched networks.)

If you would like to run the search asynchronously and parallelize each architecture's back-propagation training, set --n_offspring to 1. The algorithm will then run in steady-state mode, in which the population is updated as soon as one new architecture candidate is evaluated. This works reasonably well in the single-objective case; a similar strategy is used here.
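For example, a steady-state variant of the micro search above (same flags, with --n_offspring set to 1):

python search/evolution_search.py --search_space micro --init_channels 16 --layers 8 --epochs 20 --n_offspring 1 --n_gens 30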

Visualization

To visualize the architectures:

python visualization/macro_visualize.py NSGANet            # macro search space architectures
python visualization/micro_visualize.py NSGANet            # micro search space architectures

For a customized architecture, first define it in models/*_genotypes.py, then substitute NSGANet with the name of your customized architecture.

Citations

If you find the code useful for your research, please consider citing our work:

@inproceedings{nsganet,
  title={NSGA-NET: a multi-objective genetic algorithm for neural architecture search},
  author={Lu, Zhichao and Whalen, Ian and Boddeti, Vishnu and Dhebar, Yashesh and Deb, Kalyanmoy and Goodman, Erik and Banzhaf, Wolfgang},
  booktitle={GECCO-2019},
  year={2019}
}

Acknowledgement

The code is heavily inspired by and modified from pymoo, DARTS and pytorch-cifar10.
