
scikit-optimize / Scikit Optimize

License: BSD-3-Clause
Sequential model-based optimization with a `scipy.optimize` interface

Projects that are alternatives to or similar to Scikit Optimize

Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+162%)
Mutual labels:  bayesian-optimization, scikit-learn, hyperparameter-optimization, hyperparameter-tuning, hyperparameter-search
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (-77.15%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization, hyperparameter-tuning
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (-71.3%)
Mutual labels:  optimization, scikit-learn, hyperparameter-optimization, hyperparameter-tuning
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (-91.94%)
Mutual labels:  bayesian-optimization, optimization, scikit-learn, hyperparameter-optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (-93.62%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (-95.04%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-98.49%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (-87.2%)
Mutual labels:  bayesian-optimization, hyperparameter-optimization, hyperparameter-tuning
Neuraxle
A Sklearn-like Framework for Hyperparameter Tuning and AutoML in Deep Learning projects. Finally have the right abstractions and design patterns to properly do AutoML. Let your pipeline steps have hyperparameter spaces. Enable checkpoints to cut duplicate calculations. Go from research to production environment easily.
Stars: ✭ 377 (-83.3%)
Mutual labels:  scikit-learn, hyperparameter-optimization, hyperparameter-tuning
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (-80.07%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization
Rl Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (-92.87%)
Mutual labels:  optimization, hyperparameter-optimization, hyperparameter-tuning
syne-tune
Large scale and asynchronous Hyperparameter Optimization at your fingertip.
Stars: ✭ 105 (-95.35%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
ultraopt
Distributed Asynchronous Hyperparameter Optimization better than HyperOpt.
Stars: ✭ 93 (-95.88%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
polystores
A library for performing hyperparameter optimization
Stars: ✭ 48 (-97.87%)
Mutual labels:  scikit-learn, hyperparameter-optimization, hyperparameter-tuning
osprey
🦅Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (-96.86%)
Mutual labels:  optimization, scikit-learn, hyperparameter-optimization
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (-96.32%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, hyperparameter-search
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (-92.34%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (-89.33%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-99.03%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (-75.02%)
Mutual labels:  bayesian-optimization, hyperparameter-optimization, hyperparameter-tuning

Scikit-Optimize

Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt aims to be accessible and easy to use in many contexts.

The library is built on top of NumPy, SciPy and Scikit-Learn.

We do not perform gradient-based optimization; for gradient-based optimization algorithms, see scipy.optimize.

Approximated objective function after 50 iterations of gp_minimize (plot made using skopt.plots.plot_objective).

Install

scikit-optimize requires:

  • Python (>= 3.6)
  • NumPy (>= 1.13.3)
  • SciPy (>= 0.19.1)
  • joblib (>= 0.11)
  • scikit-learn (>= 0.20)
  • matplotlib (>= 2.0.0)

You can install the latest release with:

pip install scikit-optimize

This installs scikit-optimize with only its essential dependencies. To install scikit-optimize with plotting functionality, you can instead do:

pip install 'scikit-optimize[plots]'

This will install matplotlib along with scikit-optimize.

In addition, there is a conda-forge package of scikit-optimize:

conda install -c conda-forge scikit-optimize

Using conda-forge is probably the easiest way to install scikit-optimize on Windows.

Getting started

Find the minimum of the noisy function f(x) over the range -2 < x < 2 with skopt:

import numpy as np
from skopt import gp_minimize

def f(x):
    # noisy one-dimensional objective: sin(5x) * (1 - tanh(x^2)) + Gaussian noise
    return (np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)) +
            np.random.randn() * 0.1)

res = gp_minimize(f, [(-2.0, 2.0)])  # minimize over the single dimension [-2.0, 2.0]
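
The returned res object behaves like a SciPy OptimizeResult: res.x holds the best parameters found and res.fun the corresponding objective value, while res.x_iters and res.func_vals record every evaluated point. For example:

print('best x: %.4f, best f(x): %.4f' % (res.x[0], res.fun))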

For more control over the optimization loop you can use the skopt.Optimizer class:

from skopt import Optimizer

opt = Optimizer([(-2.0, 2.0)])

for i in range(20):
    suggested = opt.ask()     # ask for the next point to evaluate
    y = f(suggested)          # evaluate the objective at that point
    opt.tell(suggested, y)    # report the observation back to the optimizer
    print('iteration:', i, suggested, y)
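
The optimizer keeps all observations made so far in opt.Xi and opt.yi, and opt.get_result() packages them into the same kind of result object that gp_minimize returns:

res = opt.get_result()
print('best x:', res.x, 'best y:', res.fun)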

Read our introduction to Bayesian optimization and the other examples.
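
Because the library is built on scikit-learn, skopt also provides BayesSearchCV, a largely drop-in replacement for scikit-learn's GridSearchCV that evaluates a fixed budget of parameter settings chosen by Bayesian optimization. A minimal sketch, using a toy dataset and estimator purely for illustration:

from sklearn.datasets import load_iris
from sklearn.svm import SVC
from skopt import BayesSearchCV
from skopt.space import Categorical, Real

X, y = load_iris(return_X_y=True)

search = BayesSearchCV(
    SVC(),
    # search space: log-uniform over C, categorical over the kernel
    {'C': Real(1e-3, 1e3, prior='log-uniform'),
     'kernel': Categorical(['linear', 'rbf'])},
    n_iter=16,  # number of parameter settings to evaluate
    cv=3,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)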

Development

The library is still experimental and under heavy development. Check out the next milestone for the plans for the next release, or look at some easy issues to get started contributing.

The development version can be installed through:

git clone https://github.com/scikit-optimize/scikit-optimize.git
cd scikit-optimize
pip install -e .

Run all tests by executing pytest in the top-level directory.

To run only the subset of tests with short run times, use pytest -m 'fast_test' (pytest -m 'slow_test' is also possible). To exclude all slow-running tests, try pytest -m 'not slow_test'.

This is implemented using pytest markers. If a test runs longer than 1 second, it is marked as slow; otherwise it is marked as fast.
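
A hedged sketch of how such a marker is applied in a test file (the exact usage in the skopt test suite may differ):

import pytest

@pytest.mark.fast_test
def test_quick_check():
    # short-running test, selected by `pytest -m 'fast_test'`
    assert 1 + 1 == 2

@pytest.mark.slow_test
def test_long_running_check():
    # runs longer than 1 second, excluded by `pytest -m 'not slow_test'`
    assert sum(range(10)) == 45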

All contributors are welcome!

Making a Release

The release procedure is almost completely automated. When you tag a new release, Travis builds all required packages and pushes them to PyPI. To make a release, create a new issue and work through the following checklist:

  • update the version tag in __init__.py
  • update the version tag mentioned in the README
  • check if the dependencies in setup.py are valid or need unpinning
  • check that the doc/whats_new/v0.X.rst is up to date
  • check that the last build of master succeeded
  • create a new release
  • ping conda-forge

Before making a release we usually create a release candidate. If the next release is v0.X then the release candidate should be tagged v0.Xrc1 in __init__.py. Mark a release candidate as a "pre-release" on GitHub when you tag it.
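
For example, if the next release were v0.9, the version string in __init__.py would presumably read (illustrative version number only):

__version__ = "0.9rc1"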

Commercial support

Feel free to get in touch if you need commercial support or would like to sponsor development. Resources go towards paying for additional work by seasoned engineers and researchers.

Made possible by

The scikit-optimize project was made possible with the support of

  • Wild Tree Tech
  • NYU Center for Data Science
  • NSF
  • Northrop Grumman

If your employer allows you to work on scikit-optimize during the day and would like recognition, feel free to add them to the "Made possible by" list.
