
microsoft / Petridishnn

License: MIT
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search

Programming Languages

Python

Projects that are alternatives of or similar to Petridishnn

Efficientnas
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search https://arxiv.org/abs/1807.06906
Stars: ✭ 44 (-60.71%)
Mutual labels:  image-classification, automl, neural-architecture-search
Aognet
Code for CVPR 2019 paper: "Learning Deep Compositional Grammatical Architectures for Visual Recognition"
Stars: ✭ 132 (+17.86%)
Mutual labels:  image-classification, imagenet, cifar10
Naszilla
Naszilla is a Python library for neural architecture search (NAS)
Stars: ✭ 181 (+61.61%)
Mutual labels:  automl, neural-architecture-search, cifar10
Torchdistill
PyTorch-based modular, configuration-driven framework for knowledge distillation. 🏆 18 methods including SOTA are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility.
Stars: ✭ 177 (+58.04%)
Mutual labels:  image-classification, imagenet, cifar10
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+175.89%)
Mutual labels:  automl, imagenet, neural-architecture-search
Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-8.93%)
Mutual labels:  automl, imagenet, neural-architecture-search
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (+6.25%)
Mutual labels:  image-classification, automl, neural-architecture-search
BottleneckTransformers
Bottleneck Transformers for Visual Recognition
Stars: ✭ 231 (+106.25%)
Mutual labels:  imagenet, image-classification, cifar10
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+3400%)
Mutual labels:  image-classification, automl, neural-architecture-search
Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+2991.96%)
Mutual labels:  image-classification, automl, neural-architecture-search
Neural Backed Decision Trees
Making decision trees competitive with neural networks on CIFAR10, CIFAR100, TinyImagenet200, Imagenet
Stars: ✭ 411 (+266.96%)
Mutual labels:  image-classification, imagenet, cifar10
Image classification cifar 10
Image Classification on the CIFAR-10 Dataset using Multi-Layer Perceptrons in Python from Scratch.
Stars: ✭ 18 (-83.93%)
Mutual labels:  image-classification, cifar10
Orange3 Imageanalytics
🍊 🎑 Orange3 add-on for dealing with image-related tasks
Stars: ✭ 24 (-78.57%)
Mutual labels:  image-classification, imagenet
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (+725.89%)
Mutual labels:  automl, neural-architecture-search
Pytorch image classification
PyTorch implementation of image classification models for CIFAR-10/CIFAR-100/MNIST/FashionMNIST/Kuzushiji-MNIST/ImageNet
Stars: ✭ 795 (+609.82%)
Mutual labels:  imagenet, cifar10
Morph Net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+736.61%)
Mutual labels:  automl, neural-architecture-search
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Stars: ✭ 10,698 (+9451.79%)
Mutual labels:  automl, neural-architecture-search
Divide And Co Training
[Paper 2020] Towards Better Accuracy-efficiency Trade-offs: Divide and Co-training. Plus, an image classification toolbox that includes ResNet, Wide-ResNet, ResNeXt, ResNeSt, ResNeXSt, SENet, Shake-Shake, DenseNet, PyramidNet, and EfficientNet.
Stars: ✭ 54 (-51.79%)
Mutual labels:  image-classification, imagenet
Imagenet
Trial on Kaggle ImageNet object localization with YOLO v3 on Google Cloud
Stars: ✭ 56 (-50%)
Mutual labels:  image-classification, imagenet
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+516.96%)
Mutual labels:  automl, neural-architecture-search

Project Petridish: Efficient Forward Architecture Search


WARNING: DEPRECATED! Much higher-quality, more performant code for Petridish is now available here. This repository is no longer maintained or supported.

Code for "Efficient Forward Architecture Search," NeurIPS 2019!

Note: this repo is under active development and the code base is expected to change rapidly. We are currently rewriting Petridish in PyTorch, with evaluation on many more datasets and pretrained models. It will appear here shortly.

Conduct and Privacy

Petridishnn has adopted the Microsoft Open Source Code of Conduct. For more information on this code of conduct, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments. Read Microsoft’s statement on Privacy & Cookies.

Installation on a development machine

We have developed and tested Petridish on Ubuntu 16.04 LTS (64-bit) with the Anaconda Python distribution and TensorFlow.

Installing the software

  1. Install the Anaconda Python distribution for Ubuntu.
  2. Create a Python 3.6 environment: conda create python=3.6 -n py36
  3. Follow the instructions to install a recent TensorFlow (TF) version; 1.12 has been tested.
  4. Clone the repo: git clone petridishnn
  5. Install the dependency packages: python -m pip install -r <path_to_petridishnn>/requirements.txt
  6. Petridish needs some environment variables: GLOBAL_LOG_DIR (directory where jobs running locally write logs), GLOBAL_MODEL_DIR (directory where jobs running locally write models), and GLOBAL_DATA_DIR (directory from which local jobs read data). Set them to appropriate values in your bashrc, as in the sketch after this list. E.g., export GLOBAL_MODEL_DIR="/home/dedey/data"
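
For example, the following lines could be appended to ~/.bashrc (the paths are illustrative; point them at directories of your choosing):

  # Illustrative paths -- adjust to your machine.
  export GLOBAL_LOG_DIR="$HOME/petridish/logs"      # logs written by jobs running locally
  export GLOBAL_MODEL_DIR="$HOME/petridish/models"  # models written by jobs running locally
  export GLOBAL_DATA_DIR="$HOME/petridish/data"     # data read by local jobs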

Getting the data

Petridish code assumes datasets are in a certain format (e.g., we transform raw ImageNet data to LMDB format). While one can always download the raw data of standard datasets and use the relevant scripts in petridishnn/petridish/data to convert them, Debadeepta Dey ([email protected]) maintains an Azure blob with all the data in the converted format (for Microsoft employees only). Please email him for access.
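
As a rough sketch, a conversion run then has the following shape (the script name and flags here are hypothetical; consult the headers of the actual scripts in petridishnn/petridish/data for the real interface):

  # Hypothetical script name and flags -- see petridish/data for the real ones.
  python petridish/data/convert_imagenet.py \
      --input_dir /path/to/raw/ILSVRC12 \
      --output_dir "$GLOBAL_DATA_DIR"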

Running a sample search job on cifar

Before doing a full-scale search on Azure, it is common to check that everything runs on a local machine. An example job script is at petridishnn/scripts/test_distributed.sh. Make sure you have set all the environment variables used in this script. Run it from the root folder of petridishnn as bash scripts/test_distributed.sh (see the sketch below). This will print some things to stdout, and will write models and logs to the corresponding folders. If this succeeds, you have a working installation. Yay!
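
Concretely, the local smoke test looks like this (assuming the environment variables from the installation section are set):

  cd <path_to_petridishnn>          # root folder of the repo
  source ~/.bashrc                  # pick up GLOBAL_LOG_DIR, GLOBAL_MODEL_DIR, GLOBAL_DATA_DIR
  bash scripts/test_distributed.sh  # writes models and logs to the directories above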

Post-search Analysis

We provide a number of scripts to analyze and post-process the search results in the directory petridish/analysis, as well as a script to generate training scripts for the found models. We list them in order of usage below; please refer to the header of each linked file for usage details. A sketch of the general invocation shape follows the list.

  1. Inspect the search log
  2. Generate scripts to train found models
  3. Check performance of model training
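
Since these are standalone scripts, the invocations share a general shape like the following (the module name and flag are hypothetical; the real names and arguments are documented in the header of each file in petridish/analysis):

  # Hypothetical module name and flag -- see petridish/analysis for the real ones.
  python -m petridish.analysis.inspect_search_log --log_dir "$GLOBAL_LOG_DIR"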

Contacts:

Contributing

Please read the contributing policy.

Bibtex

If you would like to use this work for your research, please cite the following:

@article{hu2019forwardnas,
  title={Efficient Forward Architecture Search},
  author={Hanzhang Hu and John Langford and Rich Caruana and Saurajit Mukherjee and Eric Horvitz and Debadeepta Dey},
  journal={Neural Information Processing Systems},
  year={2019}
}