arberzela / Efficientnas

License: MIT
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search https://arxiv.org/abs/1807.06906


Projects that are alternatives of or similar to Efficientnas

Naszilla
Naszilla is a Python library for neural architecture search (NAS)
Stars: ✭ 181 (+311.36%)
Mutual labels:  convolutional-neural-networks, automl, neural-architecture-search
Pba
Efficient Learning of Augmentation Policy Schedules
Stars: ✭ 461 (+947.73%)
Mutual labels:  convolutional-neural-networks, image-classification, automl
Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+7770.45%)
Mutual labels:  image-classification, automl, neural-architecture-search
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (+170.45%)
Mutual labels:  image-classification, automl, neural-architecture-search
Fast Autoaugment
Official Implementation of 'Fast AutoAugment' in PyTorch.
Stars: ✭ 1,297 (+2847.73%)
Mutual labels:  convolutional-neural-networks, image-classification, automl
Mtlnas
[CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (+31.82%)
Mutual labels:  convolutional-neural-networks, automl, neural-architecture-search
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (+154.55%)
Mutual labels:  image-classification, automl, neural-architecture-search
Autoclint
A specially designed light version of Fast AutoAugment
Stars: ✭ 171 (+288.64%)
Mutual labels:  convolutional-neural-networks, image-classification, automl
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+8809.09%)
Mutual labels:  image-classification, automl, neural-architecture-search
Artificio
Deep Learning Computer Vision Algorithms for Real-World Use
Stars: ✭ 326 (+640.91%)
Mutual labels:  convolutional-neural-networks, image-classification
Trashnet
Dataset of images of trash; Torch-based CNN for garbage image classification
Stars: ✭ 368 (+736.36%)
Mutual labels:  convolutional-neural-networks, image-classification
Computer Vision
Programming Assignments and Lectures for Stanford's CS 231: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 408 (+827.27%)
Mutual labels:  convolutional-neural-networks, image-classification
Adanet
Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+7490.91%)
Mutual labels:  automl, neural-architecture-search
Assembled Cnn
Tensorflow implementation of "Compounding the Performance Improvements of Assembled Techniques in a Convolutional Neural Network"
Stars: ✭ 319 (+625%)
Mutual labels:  convolutional-neural-networks, image-classification
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (+752.27%)
Mutual labels:  convolutional-neural-networks, image-classification
Tensorflow 101
TensorFlow 101: Introduction to Deep Learning for Python Within TensorFlow
Stars: ✭ 642 (+1359.09%)
Mutual labels:  convolutional-neural-networks, automl
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+602.27%)
Mutual labels:  automl, neural-architecture-search
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+936.36%)
Mutual labels:  automl, neural-architecture-search
Computervision Recipes
Best Practices, code samples, and documentation for Computer Vision.
Stars: ✭ 8,214 (+18568.18%)
Mutual labels:  convolutional-neural-networks, image-classification
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (+2002.27%)
Mutual labels:  automl, neural-architecture-search

EfficientNAS

Code for the paper

Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search
Arber Zela, Aaron Klein, Stefan Falkner and Frank Hutter.
arXiv:1807.06906.

This is follow-up work to BOHB: Robust and Efficient Hyperparameter Optimization at Scale. We use BOHB to analyze a joint neural architecture and hyperparameter space and demonstrate the weak correlation between training budgets that are far apart. Nevertheless, our search method surprisingly finds a configuration that achieves 3.18% test error after just 3 hours of training.
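BOHB allocates budgets via Hyperband-style successive halving: many configurations are evaluated on a small budget, and only the best 1/eta fraction is promoted to the next, eta-times-larger budget. The following is a minimal stdlib sketch of that promotion scheme only — the `evaluate` function and the toy configurations are placeholders, not the paper's actual search space or training procedure:

```python
import random

def successive_halving(configs, evaluate, min_budget, max_budget, eta=3):
    """Evaluate all configs at min_budget, keep the best 1/eta fraction at
    each rung, and stop once the survivors have been run at max_budget."""
    budget = min_budget
    survivors = list(configs)
    while True:
        # Lower loss is better; score every survivor on the current budget.
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        if budget >= max_budget:
            return scored[0]  # best configuration at the full budget
        # Promote the top 1/eta fraction to an eta-times-larger budget.
        survivors = scored[:max(1, len(scored) // eta)]
        budget = min(budget * eta, max_budget)

# Toy usage: configurations are just learning rates, and the "loss" is a
# placeholder distance to a made-up optimum of 0.01.
random.seed(0)
configs = [random.uniform(1e-4, 1e-1) for _ in range(27)]
best = successive_halving(configs, lambda lr, b: abs(lr - 0.01), 400, 10800)
```

With 27 starting configurations and eta=3, the sketch runs rungs of 27, 9, 3, and 1 survivors — BOHB adds a model-based sampler on top of this scheme rather than sampling configurations uniformly.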

Requirements

Python >= 3.6, PyTorch == 0.3.1, torchvision == 0.2.0, hpbandster, ConfigSpace

Running the joint search

The code currently supports only CIFAR-10, which is downloaded automatically. However, it can easily be extended to other image datasets with the same resolution, such as CIFAR-100 and SVHN.

To start BOHB you have to specify five parameters: min_budget, max_budget, eta, num_iterations and num_workers. You can change them in the script BOHB-CIFAR10.sh.

NOTE: We used the Slurm Workload Manager environment to run our jobs, but it can be easily adapted to other job scheduling systems.

To start the search with the default settings used in the paper (min_budget=400, max_budget=10800, eta=3, num_iterations=32, num_workers=10), just run:

sbatch BOHB-CIFAR10.sh
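With these defaults, Hyperband's geometric budget ladder runs from min_budget to max_budget in multiplicative steps of eta. A small helper (a sketch for illustration, not part of this repository) makes the resulting rungs explicit:

```python
def budget_schedule(min_budget, max_budget, eta):
    """Return the geometric ladder of budgets that Hyperband/BOHB iterates
    over: repeatedly divide max_budget by eta while staying >= min_budget."""
    budgets = [max_budget]
    while budgets[-1] / eta >= min_budget:
        budgets.append(budgets[-1] / eta)
    return budgets[::-1]

# Paper defaults: rungs of 400, 1200, 3600 and 10800 seconds (3h) per config.
print(budget_schedule(400, 10800, 3))
```

So a configuration that survives every rung is trained for up to 3 hours, while most configurations are discarded after only a few minutes of training.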

Citation

@inproceedings{zela-automl18,
  author    = {Arber Zela and
               Aaron Klein and
               Stefan Falkner and 
               Frank Hutter},
  title     = {Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search},
  booktitle = {ICML 2018 AutoML Workshop},
  year      = {2018},
  month     = jul,
}