
negrinho / Deep_architect_legacy

License: MIT
DeepArchitect: Automatically Designing and Training Deep Architectures

Programming Languages

python

Projects that are alternatives to or similar to Deep_architect_legacy

Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+53.47%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+26.39%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-18.75%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+379.86%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+2622.22%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
syne-tune
Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-27.08%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Deep architect
A general, modular, and programmable architecture search framework
Stars: ✭ 110 (-23.61%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+2120.83%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Hpbandster
A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+216.67%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+7329.17%)
Mutual labels:  hyperparameter-optimization, neural-architecture-search
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (-22.22%)
Mutual labels:  hyperparameter-optimization
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-12.5%)
Mutual labels:  neural-architecture-search
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-22.22%)
Mutual labels:  neural-architecture-search
Nas Segm Pytorch
Code for Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells, CVPR '19
Stars: ✭ 126 (-12.5%)
Mutual labels:  neural-architecture-search
Sgas
SGAS: Sequential Greedy Architecture Search (CVPR'2020) https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (-4.86%)
Mutual labels:  neural-architecture-search
Graphnas
This directory contains code necessary to run the GraphNAS algorithm.
Stars: ✭ 104 (-27.78%)
Mutual labels:  neural-architecture-search
Talos
Hyperparameter Optimization for TensorFlow, Keras and PyTorch
Stars: ✭ 1,382 (+859.72%)
Mutual labels:  hyperparameter-optimization
Scarlet Nas
Bridging the Gap Between Stability and Scalability in Neural Architecture Search
Stars: ✭ 140 (-2.78%)
Mutual labels:  neural-architecture-search
Single Path One Shot Nas Mxnet
Single Path One-Shot NAS MXNet implementation with a full training and searching pipeline. Supports both block and channel selection; searched models that outperform those in the original paper are provided.
Stars: ✭ 136 (-5.56%)
Mutual labels:  neural-architecture-search
Nasbot
Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (-16.67%)
Mutual labels:  neural-architecture-search

DeepArchitect: Automatically Designing and Training Deep Architectures

IMPORTANT: This repo is not under active development. It contains a prototype for the ideas described in this paper. See our NeurIPS 2019 paper for the latest developments. The code and documentation for the latest framework can be found here.

This repository contains a Python implementation of the DeepArchitect framework described in our paper. To get familiar with the framework, we recommend starting with this tutorial.
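
At its core, the framework treats an architecture search space as a composition of modules whose hyperparameters are left as choices, and a searcher (for example, random search) resolves those choices to obtain a concrete architecture to train and evaluate. The snippet below is only an illustrative sketch of that idea in plain Python; it is not the DeepArchitect API, and all names in it are made up for exposition.

import random

# A "choice" node: the searcher must pick exactly one of the listed values.
def choice(name, values):
    return ("choice", name, values)

# A toy search space: every architecture/training hyperparameter is a choice.
search_space = {
    "num_layers": choice("num_layers", [2, 4, 8]),
    "num_filters": choice("num_filters", [32, 64, 128]),
    "activation": choice("activation", ["relu", "tanh"]),
    "optimizer": {
        "name": choice("opt_name", ["sgd", "adam"]),
        "learning_rate": choice("learning_rate", [1e-1, 1e-2, 1e-3]),
    },
}

# A random searcher: walk the structure and resolve every choice at random.
def sample(space):
    if isinstance(space, tuple) and space and space[0] == "choice":
        return random.choice(space[2])
    if isinstance(space, dict):
        return {k: sample(v) for k, v in space.items()}
    return space

print(sample(search_space))  # one fully specified configuration to train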

A tar file with the logs of the experiments in the paper is available here. You can download it, unzip it in the top folder of the repo, and generate the plots of the paper using plots.py. The logs are composed of text and pickle files. It may be informative to inspect them. The experiments reported in the paper can be reproduced using experiments.py.
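
If you want to inspect the logs programmatically before regenerating the figures, a minimal sketch along the following lines may help. The archive and log-directory names below are placeholders (use the file from the download link above), and the exact way to invoke plots.py and experiments.py is documented in those scripts.

import pickle
import tarfile
from pathlib import Path

ARCHIVE = Path("experiment_logs.tar")  # placeholder: the downloaded tar file
REPO_ROOT = Path(".")                  # top folder of the repo
LOG_DIR = REPO_ROOT / "logs"           # placeholder: folder created by the tar

# Unpack the archive in the top folder of the repo, as described above.
with tarfile.open(ARCHIVE) as tar:
    tar.extractall(REPO_ROOT)

# The logs are a mix of text and pickle files; walk them and peek at each one.
for p in sorted(LOG_DIR.rglob("*")):
    if p.suffix == ".pkl":
        with p.open("rb") as f:
            print(p, type(pickle.load(f)))
    elif p.suffix == ".txt":
        print(p, p.read_text()[:200])

# The paper's figures and experiments can then be regenerated with plots.py
# and experiments.py, respectively (see those scripts for any arguments).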

Contributors: Renato Negrinho, Geoff Gordon, Matt Gormley, Christoph Dann, Matt Barnes.

References

@article{negrinho2017deeparchitect,
  title={DeepArchitect: Automatically designing and training deep architectures},
  author={Negrinho, Renato and Gordon, Geoff},
  journal={arXiv preprint arXiv:1704.08792},
  year={2017}
}

@inproceedings{negrinho2019towards,
  title={Towards modular and programmable architecture search},
  author={Negrinho, Renato and Patil, Darshan and Le, Nghia and Ferreira, Daniel and Gormley, Matthew and Gordon, Geoffrey},
  booktitle={Neural Information Processing Systems},
  year={2019}
}