
awslabs / syne-tune

Licence: Apache-2.0
Large scale and asynchronous Hyperparameter Optimization at your fingertips.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to syne-tune

Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+73.33%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, neural-architecture-search
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+129.52%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-67.62%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+110.48%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, neural-architecture-search
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+391.43%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+2050.48%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
open-box
Generalized and Efficient Blackbox Optimization System.
Stars: ✭ 64 (-39.05%)
Mutual labels:  multi-objective-optimization, bayesian-optimization, hyperparameter-tuning
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (+175.24%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-79.05%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+334.29%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, neural-architecture-search
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+5534.29%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+437.14%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Nni
An open source AutoML toolkit for automate machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+10088.57%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, neural-architecture-search
Cornell Moe
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+88.57%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (+79.05%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (+88.57%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Gpflowopt
Bayesian Optimization using GPflow
Stars: ✭ 229 (+118.1%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mltb
Machine Learning Tool Box
Stars: ✭ 25 (-76.19%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+64.76%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (-35.24%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning

Syne Tune


This package provides state-of-the-art distributed hyperparameter optimizers (HPO), where trials can be evaluated with several trial backend options: a local backend to evaluate trials on the local machine, SageMaker to evaluate trials as separate SageMaker training jobs, and a simulation backend to quickly benchmark parallel asynchronous schedulers.
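
The backend is selected when constructing the Tuner (see the Getting started section below). As a rough sketch, and assuming the SageMakerBackend class and its sm_estimator argument (not shown in this README), switching from local to SageMaker execution only changes the backend object:

from syne_tune.backend import LocalBackend

# Evaluate each trial as a subprocess on the local machine.
trial_backend = LocalBackend(entry_point='train_height.py')

# Assumption: SageMakerBackend wraps a SageMaker estimator and runs each
# trial as a separate SageMaker training job (instance type, framework
# version and role below are illustrative placeholders).
# from sagemaker.pytorch import PyTorch
# from syne_tune.backend import SageMakerBackend
# trial_backend = SageMakerBackend(
#     sm_estimator=PyTorch(
#         entry_point='train_height.py',
#         instance_type='ml.m5.large',
#         instance_count=1,
#         role='<your SageMaker execution role>',
#         framework_version='1.7.1',
#         py_version='py3',
#     ),
# )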

Installing

To install Syne Tune with pip, simply run:

pip install 'syne-tune'

This installs a bare-bones version. If you want to additionally install our own Gaussian process based optimizers, Ray Tune, or the Bore optimizer, you can run pip install 'syne-tune[X]', where X can be:

  • gpsearchers: For built-in Gaussian process based optimizers
  • raytune: For Ray Tune optimizers
  • benchmarks: For installing all dependencies required to run all benchmarks
  • extra: For installing all the above
  • bore: For the Bore optimizer
  • kde: For the KDE optimizer

For instance, pip install 'syne-tune[gpsearchers]' will install Syne Tune along with many built-in Gaussian process optimizers.
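
Several extras can also be combined in a single command using pip's standard comma-separated syntax, for example:

pip install 'syne-tune[gpsearchers,raytune]'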

To install the latest version from git, run the following:

pip install git+https://github.com/awslabs/syne-tune.git

For local development, we recommend the following setup, which will enable you to easily test your changes:

pip install --upgrade pip
git clone https://github.com/awslabs/syne-tune.git
cd syne-tune
pip install -e '.[extra]'

To run unit tests, simply run pytest in the root of this repository.

To run all tests whose names begin with test_async_scheduler, use the following command:

pytest -k test_async_scheduler

Getting started

To enable tuning, you have to report metrics from a training script so that they can be communicated to Syne Tune. This can be accomplished by calling report(epoch=epoch, loss=loss), as shown in the example below:

# train_height.py
import logging
import time

from syne_tune import Reporter
from argparse import ArgumentParser

if __name__ == '__main__':
    root = logging.getLogger()
    root.setLevel(logging.INFO)

    parser = ArgumentParser()
    parser.add_argument('--steps', type=int)
    parser.add_argument('--width', type=float)
    parser.add_argument('--height', type=float)

    args, _ = parser.parse_known_args()
    report = Reporter()

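    # Simulated training loop: the dummy loss decreases with the step and
    # depends on the width and height hyperparameters.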
    for step in range(args.steps):
        dummy_score = (0.1 + args.width * step / 100) ** (-1) + args.height * 0.1
        # Feed the score back to Syne Tune.
        report(step=step, mean_loss=dummy_score, epoch=step + 1)
        time.sleep(0.1)

Once you have a script reporting metrics, you can launch a tuning experiment as follows:

from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import randint
from syne_tune.optimizer.baselines import ASHA

# hyperparameter search space to consider
config_space = {
    'steps': 100,
    'width': randint(1, 20),
    'height': randint(1, 20),
}

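# ASHA stops underperforming trials early, using the reported 'epoch' value
# as the resource (one unit per call to report in the training script).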
tuner = Tuner(
    trial_backend=LocalBackend(entry_point='train_height.py'),
    scheduler=ASHA(
        config_space, metric='mean_loss', resource_attr='epoch', max_t=100,
        search_options={'debug_log': False},
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=15),
    n_workers=4,  # how many trials are evaluated in parallel
)
tuner.run()

The above example runs ASHA with 4 asynchronous workers on a local machine.
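
Tuning results are written to disk as the experiment runs. As a minimal sketch, and assuming the load_experiment helper in syne_tune.experiments together with its best_config() accessor (neither is shown in this README), results can be reloaded and inspected afterwards:

from syne_tune.experiments import load_experiment

# Assumption: tuner.name identifies the experiment on disk and the returned
# object exposes the best configuration found during tuning.
tuning_experiment = load_experiment(tuner.name)
print(tuning_experiment.best_config())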

Examples

You will find examples in the examples/ folder illustrating different functionalities provided by Syne Tune.

FAQ and Tutorials

You can check our FAQ to learn more about Syne Tune functionalities.

Do you want to know more? A number of tutorials are available.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
