
deephyper / Deephyper

License: other
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Deephyper

Deep architect
A general, modular, and programmable architecture search framework
Stars: ✭ 110 (-5.98%)
Mutual labels:  deep-neural-networks, neural-networks, hyperparameter-optimization, neural-architecture-search
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (+1.71%)
Mutual labels:  neural-networks, automl, neural-architecture-search
Lightwood
Lightwood is Legos for Machine Learning.
Stars: ✭ 115 (-1.71%)
Mutual labels:  neural-networks, ml, automl
Milano
Milano is a tool for automating hyper-parameters search for your models on a backend of your choice.
Stars: ✭ 140 (+19.66%)
Mutual labels:  deep-neural-networks, automl, hyperparameter-optimization
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (+41.88%)
Mutual labels:  neural-networks, automl, hyperparameter-optimization
Dltk
Deep Learning Toolkit for Medical Image Analysis
Stars: ✭ 1,249 (+967.52%)
Mutual labels:  deep-neural-networks, neural-networks, ml
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+9043.59%)
Mutual labels:  automl, hyperparameter-optimization, neural-architecture-search
Automl alex
State-of-the art Automated Machine Learning python library for Tabular Data
Stars: ✭ 132 (+12.82%)
Mutual labels:  ml, automl, hyperparameter-optimization
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+2633.33%)
Mutual labels:  automl, hyperparameter-optimization, neural-architecture-search
Awesome Distributed Deep Learning
A curated list of awesome Distributed Deep Learning resources.
Stars: ✭ 277 (+136.75%)
Mutual labels:  deep-neural-networks, neural-networks, hyperparameter-optimization
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+3250.43%)
Mutual labels:  automl, hyperparameter-optimization, neural-architecture-search
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+88.89%)
Mutual labels:  hyperparameter-optimization, automl, neural-architecture-search
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+289.74%)
Mutual labels:  automl, hyperparameter-optimization, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+490.6%)
Mutual labels:  automl, hyperparameter-optimization, neural-architecture-search
Caffe2
Caffe2 is a lightweight, modular, and scalable deep learning framework.
Stars: ✭ 8,409 (+7087.18%)
Mutual labels:  deep-neural-networks, ml
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+914.53%)
Mutual labels:  automl, neural-architecture-search
Dann
Deep Neural Network Sandbox for JavaScript.
Stars: ✭ 75 (-35.9%)
Mutual labels:  deep-neural-networks, neural-networks
Awesome System For Machine Learning
A curated list of research in machine learning systems. I also summarize some papers if I think they are really interesting.
Stars: ✭ 1,185 (+912.82%)
Mutual labels:  deep-neural-networks, automl
Niftynet
[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
Stars: ✭ 1,276 (+990.6%)
Mutual labels:  deep-neural-networks, ml
Codesearchnet
Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+1077.78%)
Mutual labels:  neural-networks, ml


What is DeepHyper?

DeepHyper is an automated machine learning (AutoML) package for deep neural networks. It comprises two components: 1) Neural architecture search, an approach for automatically finding high-performing deep neural network architectures. 2) Hyperparameter search, an approach for automatically finding high-performing hyperparameters for a given deep neural network. DeepHyper provides an infrastructure that targets experimental research in neural architecture and hyperparameter search methods, scalability, and portability across HPC systems. It comprises three modules: benchmarks, a collection of extensible and diverse benchmark problems; search, a set of search algorithms for neural architecture search and hyperparameter search; and evaluators, a common interface for evaluating hyperparameter configurations on HPC platforms.
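A hyperparameter search, for example, pairs a search-space definition with a run function that trains or evaluates a model and returns a scalar objective, which DeepHyper maximizes. Below is a minimal sketch modeled on the polynome2 benchmark used in the Quickstart; the HpProblem import path and method names have shifted across DeepHyper releases, so treat it as illustrative rather than canonical:

# problem.py -- minimal HPS problem sketch (illustrative; API details vary by release)
from deephyper.problem import HpProblem

Problem = HpProblem()
Problem.add_hyperparameter((-10.0, 10.0), "x")  # one continuous hyperparameter named "x"

def run(config):
    # config maps hyperparameter names to sampled values; return the
    # scalar objective that the search should maximize.
    return -config["x"] ** 2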

Documentation

DeepHyper's documentation is available on ReadTheDocs

Install instructions

From pip:

pip install deephyper

From GitHub:

git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e .

If you want to install DeepHyper with the test and documentation dependencies:

# From PyPI
pip install 'deephyper[tests,docs]'

# From GitHub
git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e '.[tests,docs]'
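After either install, a quick sanity check (assuming the package exposes a __version__ attribute, as recent releases do):

python -c "import deephyper; print(deephyper.__version__)"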

Directory structure

benchmark/
    a set of problems for hyperparameter or neural architecture search that users can employ to compare search algorithms, or use as templates for building their own problems.
evaluator/
    a set of objects that help run searches on different systems and for different use cases, from quick, lightweight experiments to long, heavy runs.
search/
    a set of algorithms for hyperparameter and neural architecture search, along with a modular way to define new search algorithms. It contains dedicated submodules:
hps/
        hyperparameter search applications
nas/
        neural architecture search applications

How do I learn more?

Quickstart

Hyperparameter Search (HPS)

An example command line for HPS:

deephyper hps ambs --evaluator ray --problem deephyper.benchmark.hps.polynome2.Problem --run deephyper.benchmark.hps.polynome2.run --n-jobs 1
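Here ambs selects the asynchronous model-based search, --evaluator ray selects the Ray-based evaluator, and --problem and --run take Python import paths to a Problem object and a run function. The same flags accept paths to your own code; in the sketch below, mypackage.myproblem is a hypothetical module that only needs to be importable from your environment:

deephyper hps ambs --evaluator ray --problem mypackage.myproblem.Problem --run mypackage.myproblem.run --n-jobs 1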

Neural Architecture Search (NAS)

An example command line for NAS:

deephyper nas ambs --evaluator ray --problem deephyper.benchmark.nas.polynome2Reg.Problem --n-jobs 1
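Note that, unlike the HPS example, no --run argument is passed: the bundled NAS problems supply their own training setup as part of the Problem definition. Searching over your own problem follows the same pattern (mypackage.mynasproblem below is hypothetical):

deephyper nas ambs --evaluator ray --problem mypackage.mynasproblem.Problem --n-jobs 1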

Who is responsible?

Currently, the core DeepHyper team is at Argonne National Laboratory:

Modules, patches (code, documentation, etc.) contributed by:

Citing DeepHyper

If you are referencing DeepHyper in a publication, please cite the following papers:

  • P. Balaprakash, M. Salim, T. Uram, V. Vishwanath, and S. M. Wild. DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks. In 25th IEEE International Conference on High Performance Computing, Data, and Analytics. IEEE, 2018.

  • P. Balaprakash, R. Egele, M. Salim, S. Wild, V. Vishwanath, F. Xia, T. Brettin, and R. Stevens. Scalable reinforcement-learning-based neural architecture search for cancer deep learning research. In SC ’19: IEEE/ACM International Conference on High Performance Computing, Networking, Storage and Analysis, 2019.

How can I participate?

Questions, comments, feature requests, bug reports, etc. can be directed to:

  • Issues on GitHub

Patches via pull requests are much appreciated, for both the software itself and the documentation. Optionally, please include in your first patch a credit for yourself in the list above.

The DeepHyper team uses git-flow to organize development (see the Git-Flow cheatsheet) and Pytest for testing.

Acknowledgements

  • Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018-Present)
  • Argonne Leadership Computing Facility: This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
  • SLIK-D: Scalable Machine Learning Infrastructures for Knowledge Discovery, Argonne Computing, Environment and Life Sciences (CELS) Laboratory Directed Research and Development (LDRD) Program (2016-2018)

Copyright and license

Copyright © 2019, UChicago Argonne, LLC

DeepHyper is distributed under the terms of the BSD License. See LICENSE

Argonne Patent & Intellectual Property File Number: SF-19-007
