
NVIDIA / Milano

License: Apache-2.0
Milano is a tool for automating hyper-parameter search for your models on a backend of your choice.

Programming Languages

python

Projects that are alternatives of or similar to Milano

mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-75.71%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+4125.71%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
Automl alex
State-of-the art Automated Machine Learning python library for Tabular Data
Stars: ✭ 132 (-5.71%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (+41.43%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-16.43%)
Mutual labels:  deep-neural-networks, automl, hyperparameter-optimization
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+57.86%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (+18.57%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (-40.71%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+302.86%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
Rl Baselines Zoo
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Stars: ✭ 839 (+499.29%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Mljar Supervised
Automated Machine Learning Pipeline with Feature Engineering and Hyper-Parameters Tuning 🚀
Stars: ✭ 961 (+586.43%)
Mutual labels:  automl, hyperparameter-optimization
Tpot
A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.
Stars: ✭ 8,378 (+5884.29%)
Mutual labels:  automl, hyperparameter-optimization
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+393.57%)
Mutual labels:  automl, hyperparameter-optimization
Mlprimitives
Primitives for machine learning and data science.
Stars: ✭ 46 (-67.14%)
Mutual labels:  automl, hyperparameter-tuning
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+362.86%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Determined
Determined: Deep Learning Training Platform
Stars: ✭ 1,171 (+736.43%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Awesome System For Machine Learning
A curated list of research in machine learning system. I also summarize some papers if I think they are really interesting.
Stars: ✭ 1,185 (+746.43%)
Mutual labels:  deep-neural-networks, automl
Mgo
Purely functional genetic algorithms for multi-objective optimisation
Stars: ✭ 63 (-55%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hyperopt Keras Cnn Cifar 100
Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could be easily transferred to another dataset or another classification task.
Stars: ✭ 95 (-32.14%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Opentpod
Open Toolkit for Painless Object Detection
Stars: ✭ 106 (-24.29%)
Mutual labels:  deep-neural-networks, automl

Milano

(This is a research project, not an official NVIDIA product.)

Documentation

https://nvidia.github.io/Milano

Milano (Machine learning autotuner and network optimizer) is a tool that enables machine learning researchers and practitioners to perform massive hyper-parameter and architecture searches.

Your training script can use any framework of your choice (for example, TensorFlow, PyTorch, or Microsoft Cognitive Toolkit), or no framework at all. Milano only requires minimal changes to how your script accepts hyperparameters via the command line and how it reports results to stdout.
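To illustrate that contract, here is a minimal sketch of a script a tuner like Milano could drive. The flag names (`--lr`, `--batch_size`), the toy objective, and the exact stdout line are illustrative assumptions for the example, not Milano's actual interface; consult the documentation for the real format.

```python
import argparse

# train.py: a toy stand-in for a tunable training script. Hyperparameters
# arrive as command-line flags; the benchmark value is reported on stdout.

def objective(lr, batch_size):
    # Stand-in for real training: a simple function of the hyperparameters,
    # minimized at lr == 0.01 and batch_size == 0.
    return (lr - 0.01) ** 2 + 0.001 * batch_size

def main(argv=None):
    parser = argparse.ArgumentParser(description="toy tunable script")
    parser.add_argument("--lr", type=float, default=0.1)
    parser.add_argument("--batch_size", type=int, default=32)
    args = parser.parse_args(argv)
    # Print the value to optimize so the tuner can parse it from stdout.
    print(f"valid_loss: {objective(args.lr, args.batch_size):.6f}")

if __name__ == "__main__":
    main()
```

The key design point is that the script itself knows nothing about the tuner: it is an ordinary command-line program, which is why any framework (or none) works.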

Currently supported backends:

  • Azkaban - on a single multi-GPU machine or server with Azkaban installed
  • AWS - Amazon cloud using GPU instances
  • SLURM - any cluster which is running SLURM

Prerequisites

  • Linux
  • Python 3.5 or later, with the packages listed in requirements.txt
  • A backend with NVIDIA GPUs

How to Get Started

  1. Install all dependencies: pip install -r requirements.txt
  2. Follow the mini-tutorial for a local machine or the mini-tutorial for AWS.

Visualize

We provide a script that converts the CSV results file into two kinds of graphs:

  • Graphs of each hyperparameter against the benchmark (e.g., validation perplexity)
  • Color graphs that show the relationship between any two hyperparameters and the benchmark

To run the script, use:

python3 visualize.py --file [the name of the results csv file] 
                     --n [the number of samples to visualize]
                     --subplots [the number of subplots to show in a plot]
                     --max [the max value of benchmark you care about]
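If you just want the best run rather than a plot, a results CSV of this shape can also be post-processed directly. The following sketch assumes one column per hyperparameter plus a benchmark column; that layout is an assumption for the example, not taken from Milano's documented file format.

```python
import csv
import io

def best_run(csv_text, benchmark_col="benchmark", maximize=False):
    # Parse the results CSV and return the row (as a dict) with the best
    # benchmark value: lowest by default, highest if maximize=True.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    pick = max if maximize else min
    return pick(rows, key=lambda row: float(row[benchmark_col]))

# Hypothetical results file: two hyperparameters and a benchmark column.
sample = """lr,batch_size,benchmark
0.1,32,5.21
0.01,64,4.87
0.3,16,6.02
"""

print(best_run(sample))  # row with the lowest benchmark value
```

For a real results file, replace `sample` with `open(path).read()` and pass the actual benchmark column name.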