gdikov / Hypertunity

Licence: Apache-2.0
A toolset for black-box hyperparameter optimisation.

Programming Languages

Python: 139,335 projects; #7 most used programming language

Projects that are alternatives to or similar to Hypertunity

Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+8889.92%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+278.15%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
CamelsOptimizer
Yes, it's a camel case.
Stars: ✭ 17 (-85.71%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
syne-tune
Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-11.76%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+4871.43%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (+21.01%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (+142.86%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Gpflowopt
Bayesian Optimization using GPflow
Stars: ✭ 229 (+92.44%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+373.95%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)
Stars: ✭ 516 (+333.61%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (-21.85%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Bayeso
Simple, but essential Bayesian optimization package
Stars: ✭ 57 (-52.1%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-81.51%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
hyper-engine
Python library for Bayesian hyper-parameter optimization
Stars: ✭ 80 (-32.77%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+102.52%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-71.43%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+52.94%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Cornell Moe
A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+66.39%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hpbandster
A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+283.19%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Gradient Free Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+497.48%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization

Why Hypertunity

Hypertunity is a lightweight, high-level library for hyperparameter optimisation. Among other features, it supports:

  • Bayesian optimisation by wrapping GPyOpt,
  • external or internal objective function evaluation by a scheduler, also compatible with Slurm,
  • real-time visualisation of results in TensorBoard via the HParams plugin.

For the full set of features, refer to the documentation.

Quick start

Define the objective function to optimise. For example, it can take the hyperparameters params as input and return a raw score value as output:

import hypertunity as ht

def foo(**params) -> float:
    # do some very costly computations
    ...
    return score

To define the valid ranges for the values of params, we create a Domain object:

domain = ht.Domain({
    "x": [-10., 10.],         # continuous variable within the interval [-10., 10.]
    "y": {"opt1", "opt2"},    # categorical variable from the set {"opt1", "opt2"}
    "z": set(range(4))        # discrete variable from the set {0, 1, 2, 3}
})
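
To make the example concrete, here is a toy stand-in for foo that is compatible with this domain; the quadratic form and its constants are made up purely for illustration:

def foo(x: float, y: str, z: int) -> float:
    # Toy objective: a parabola in x whose value is shifted by the
    # categorical choice y and the discrete variable z. A real,
    # costly computation would go here instead.
    offset = 1.0 if y == "opt1" else 2.0
    return (x - 3.0) ** 2 + offset * z

Because the optimiser later calls foo(**s.as_dict()), any function accepting the domain's variables as keyword arguments will do.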

Then we set up the optimiser, in this case the GPyOpt-backed Bayesian optimisation mentioned above:

bo = ht.BayesianOptimisation(domain=domain)

Now we run the optimisation for 10 steps. Each result is used to update the optimiser so that subsequent domain samples are drawn in an informed way:

n_steps = 10
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)      # suggest next samples
    evaluations = [foo(**s.as_dict()) for s in samples]     # evaluate foo
    bo.update(samples, evaluations)                         # update the optimiser
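
Since each step already yields both the samples and their evaluations, the best configuration can be tracked inline, without relying on the optimiser's internals; a minimal extension of the loop above:

n_steps = 10
best_sample, best_score = None, float("inf")
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)
    evaluations = [foo(**s.as_dict()) for s in samples]
    bo.update(samples, evaluations)
    # keep the best (sample, score) pair seen so far
    for s, e in zip(samples, evaluations):
        if e < best_score:
            best_sample, best_score = s, e

print(f"Best score {best_score:.4f} at {best_sample.as_dict()}")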

Finally, we visualise the results in TensorBoard:

import hypertunity.reports.tensorboard as tb

results = tb.Tensorboard(domain=domain, metrics=["score"], logdir="path/to/logdir")
results.from_history(bo.history)
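
With the logs written, the dashboard can be opened by pointing a regular TensorBoard instance at the same directory (this assumes TensorBoard is installed, e.g. via the tensorboard extra described under Installation below):

tensorboard --logdir path/to/logdir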

Even quicker start

The high-level wrapper class Trial enables seamless parallel optimisation without the boilerplate of scheduling jobs, updating the optimiser and logging results:

trial = ht.Trial(objective=foo,
                 domain=domain,
                 optimiser="bo",
                 reporter="tensorboard",
                 metrics=["score"])
trial.run(n_steps, batch_size=2, n_parallel=2)
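
Assuming foo and domain are defined as in the Quick start above, this single call is roughly equivalent to the manual loop: it runs the GPyOpt-backed Bayesian optimiser for n_steps steps, evaluates each batch of two samples with two parallel workers, and streams the score metric to TensorBoard.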

Installation

Using PyPI

To install the base version, run:

pip install hypertunity

To use the TensorBoard dashboard, build the docs, or run the test suite, you will need the following extras:

pip install hypertunity[tensorboard,docs,tests]

From source

Check out the latest master and install locally:

git clone https://github.com/gdikov/hypertunity.git
cd hypertunity
pip install ./[tensorboard]