zygmuntz / Hyperband

Licence: other
Tuning hyperparams fast with Hyperband

Programming Languages

python

Projects that are alternatives to or similar to Hyperband

mlr3tuning
Hyperparameter optimization package of the mlr3 ecosystem
Stars: ✭ 44 (-92.07%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (-85.05%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (-56.58%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Neuraxle
A Sklearn-like Framework for Hyperparameter Tuning and AutoML in Deep Learning projects. Finally have the right abstractions and design patterns to properly do AutoML. Let your pipeline steps have hyperparameter spaces. Enable checkpoints to cut duplicate calculations. Go from research to production environment easily.
Stars: ✭ 377 (-32.07%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
polystores
A library for performing hyperparameter optimization
Stars: ✭ 48 (-91.35%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-60.18%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-96.04%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Rl Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (-70.99%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
bbopt
Black box hyperparameter optimization made easy.
Stars: ✭ 66 (-88.11%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
syne-tune
Large scale and asynchronous Hyperparameter Optimization at your fingertip.
Stars: ✭ 105 (-81.08%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (-64.32%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (-47.93%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-66.13%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (-87.75%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (-70.09%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
naturalselection
A general-purpose pythonic genetic algorithm.
Stars: ✭ 17 (-96.94%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Pbt
Population Based Training (in PyTorch with sqlite3). Status: Unsupported
Stars: ✭ 138 (-75.14%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Milano
Milano is a tool for automating hyper-parameters search for your models on a backend of your choice.
Stars: ✭ 140 (-74.77%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
mltb
Machine Learning Tool Box
Stars: ✭ 25 (-95.5%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-93.87%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning

hyperband

Code for tuning hyperparams with Hyperband, adapted from the paper Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization.

defs/ - functions and search space definitions for various classifiers
defs_regression/ - the same for regression models
common_defs.py - imports and definitions shared by defs files
hyperband.py - the Hyperband implementation (from hyperband import Hyperband)

load_data.py - classification defs import data from this file
load_data_regression.py - regression defs import data from this file

main.py - a complete example for classification
main_regression.py - the same, for regression
main_simple.py - a simple, bare-bones example

The goal is to provide a fully functional implementation of Hyperband, along with ready-to-use search space definitions for a number of models (classifiers and regressors). Currently these include four from scikit-learn and four others:

  • gradient boosting (GB)
  • random forest (RF)
  • extremely randomized trees (XT)
  • linear SGD
  • factorization machines from polylearn
  • polynomial networks from polylearn
  • a multilayer perceptron from Keras
  • gradient boosting from XGBoost (classification only)
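
Each definitions file exposes the same two-function interface: get_params() samples one configuration from the search space, and try_params() trains with a given budget and reports a result dict. Below is a toy sketch of that interface, assuming load_data.py provides the arrays described under Loading data; the bundled defs files define much richer search spaces:

from random import choice, randint

from sklearn.ensemble import ExtraTreesClassifier
from sklearn.metrics import log_loss

from load_data import x_train, y_train, x_test, y_test

def get_params():
    # sample one configuration from a (toy) search space
    return { 'max_depth': randint( 2, 10 ), 'criterion': choice([ 'gini', 'entropy' ]) }

def try_params( n_iterations, params ):
    # in this sketch the budget translates directly into the number of trees
    clf = ExtraTreesClassifier( n_estimators = max( 1, int( round( n_iterations ))), **params )
    clf.fit( x_train, y_train )
    loss = log_loss( y_test, clf.predict_proba( x_test ))
    return { 'loss': loss }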

Meta-classifier/regressor

Use defs.meta/defs_regression.meta to try many models in one Hyperband run. This is an automatic alternative to hand-constructing search spaces that span multiple models (like defs.rf_xt or defs.polylearn_fm_pn).
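
Assuming defs.meta exposes the same get_params/try_params pair as the single-model definitions files, a run looks exactly like the usual case:

from hyperband import Hyperband
from defs.meta import get_params, try_params

hb = Hyperband( get_params, try_params )
results = hb.run()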

Loading data

Definition files in defs/ and defs_regression/ import data from load_data.py and load_data_regression.py, respectively.

Edit these files, or a definitions file directly, to make your data available for tuning.

Regression defs use the kin8nm dataset from data/kin8nm. No data is bundled for classification.

For the provided models, the data format follows scikit-learn conventions: x_train, y_train, x_test and y_test are NumPy arrays.
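
For example, a minimal load_data.py might look like this - a sketch assuming two hypothetical CSV files with the label in the last column:

# load_data.py - a sketch; replace the paths and parsing with your own
import numpy as np

train = np.loadtxt( 'data/train.csv', delimiter = ',', skiprows = 1 )
test = np.loadtxt( 'data/test.csv', delimiter = ',', skiprows = 1 )

x_train, y_train = train[:, :-1], train[:, -1]
x_test, y_test = test[:, :-1], test[:, -1]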

Usage

Run main.py (with your own data), or main_regression.py. The essence of it is:

from hyperband import Hyperband
from defs.gb import get_params, try_params

hb = Hyperband( get_params, try_params )
results = hb.run()
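
hb.run() returns one result dict per configuration evaluated; assuming each entry carries at least 'loss' and 'params' keys, the best few can be listed with something like:

# a sketch of inspecting the results
for r in sorted( results, key = lambda d: d['loss'] )[:5]:
    print( "loss: {:.4f} | params: {}".format( r['loss'], r['params'] ))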

Here's a sample output from a run (three configurations tested) using defs.xt:

3 | Tue Feb 28 15:39:54 2017 | best so far: 0.5777 (run 2)

n_estimators: 5
{'bootstrap': False,
'class_weight': 'balanced',
'criterion': 'entropy',
'max_depth': 5,
'max_features': 'sqrt',
'min_samples_leaf': 5,
'min_samples_split': 6}

# training | log loss: 62.21%, AUC: 75.25%, accuracy: 67.20%
# testing  | log loss: 62.64%, AUC: 74.81%, accuracy: 66.78%

7 seconds.

4 | Tue Feb 28 15:40:01 2017 | best so far: 0.5777 (run 2)

n_estimators: 5
{'bootstrap': False,
'class_weight': None,
'criterion': 'gini',
'max_depth': 5,
'max_features': 'sqrt',
'min_samples_leaf': 1,
'min_samples_split': 2}

# training | log loss: 53.39%, AUC: 75.69%, accuracy: 72.37%
# testing  | log loss: 53.96%, AUC: 75.29%, accuracy: 71.89%

7 seconds.

5 | Tue Feb 28 15:40:07 2017 | best so far: 0.5396 (run 4)

n_estimators: 5
{'bootstrap': True,
'class_weight': None,
'criterion': 'gini',
'max_depth': 3,
'max_features': None,
'min_samples_leaf': 7,
'min_samples_split': 8}

# training | log loss: 50.20%, AUC: 77.04%, accuracy: 75.39%
# testing  | log loss: 50.67%, AUC: 76.77%, accuracy: 75.12%

8 seconds.

Early stopping

Some models may use early stopping (as the Keras MLP example does). If a configuration stopped early, it doesn't make sense to run it with more iterations (duh). To indicate this, make try_params()

return { 'loss': loss, 'early_stop': True }

This way, Hyperband will know not to select that configuration for any further runs.
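
For instance, a Keras-based try_params() might detect early stopping like this (a sketch; build_model() is a hypothetical helper that compiles a network from params):

from keras.callbacks import EarlyStopping

from load_data import x_train, y_train, x_test, y_test

def try_params( n_iterations, params ):
    model = build_model( params )    # hypothetical: turn params into a compiled model
    early = EarlyStopping( monitor = 'val_loss', patience = 5 )
    history = model.fit( x_train, y_train,
        epochs = int( round( n_iterations )),
        validation_data = ( x_test, y_test ),
        callbacks = [ early ], verbose = 0 )
    loss = min( history.history['val_loss'] )
    # stopped_epoch > 0 means the callback fired before the epoch budget ran out
    return { 'loss': loss, 'early_stop': early.stopped_epoch > 0 }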

Moar

See http://fastml.com/tuning-hyperparams-fast-with-hyperband/ for a detailed description.
