d4l3k / go-bayesopt

License: MIT
A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.

Programming Languages

go
31211 projects - #10 most used programming language

Projects that are alternatives of or similar to go-bayesopt

Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+4704.26%)
Mutual labels:  hyperparameter-optimization, bayesopt
Bayesian Optimization
Python code for bayesian optimization using Gaussian processes
Stars: ✭ 245 (+421.28%)
Mutual labels:  hyperparameter-optimization, gaussian-processes
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+412.77%)
Mutual labels:  hyperparameter-optimization, gaussian-processes
Cornell Moe
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+321.28%)
Mutual labels:  hyperparameter-optimization, gaussian-processes
hyper-engine
Python library for Bayesian hyper-parameters optimization
Stars: ✭ 80 (+70.21%)
Mutual labels:  hyperparameter-optimization, gaussian-processes
Btb
A simple, extensible library for developing AutoML systems
Stars: ✭ 159 (+238.3%)
Mutual labels:  hyperparameter-optimization, gaussian-processes
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+287.23%)
Mutual labels:  hyperparameter-optimization
TemporalGPs.jl
Fast inference for Gaussian processes in problems involving time. Partly built on results from https://proceedings.mlr.press/v161/tebbutt21a.html
Stars: ✭ 89 (+89.36%)
Mutual labels:  gaussian-processes
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+268.09%)
Mutual labels:  hyperparameter-optimization
boundary-gp
Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
Stars: ✭ 21 (-55.32%)
Mutual labels:  gaussian-processes
Ray
An open source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+39361.7%)
Mutual labels:  hyperparameter-optimization
Rl Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (+242.55%)
Mutual labels:  hyperparameter-optimization
Orion
Asynchronous Distributed Hyperparameter Optimization.
Stars: ✭ 186 (+295.74%)
Mutual labels:  hyperparameter-optimization
scicloj.ml
A Clojure machine learning library
Stars: ✭ 152 (+223.4%)
Mutual labels:  hyperparameter-optimization
Hyperas
Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
Stars: ✭ 2,110 (+4389.36%)
Mutual labels:  hyperparameter-optimization
approxposterior
A Python package for approximate Bayesian inference and optimization using Gaussian processes
Stars: ✭ 36 (-23.4%)
Mutual labels:  gaussian-processes
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (+253.19%)
Mutual labels:  hyperparameter-optimization
Gpflowopt
Bayesian Optimization using GPflow
Stars: ✭ 229 (+387.23%)
Mutual labels:  hyperparameter-optimization
pyrff
pyrff: Python implementation of random fourier feature approximations for gaussian processes
Stars: ✭ 24 (-48.94%)
Mutual labels:  gaussian-processes
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (+321.28%)
Mutual labels:  hyperparameter-optimization

go-bayesopt

A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.

This project is under active development. If you find a bug or anything that needs correction, please let me know.

Simple Example

package main

import (
  "log"
  "math"

  "github.com/d4l3k/go-bayesopt"
)

func main() {
  X := bayesopt.UniformParam{
    Max: 10,
    Min: -10,
  }
  o := bayesopt.New(
    []bayesopt.Param{
      X,
    },
  )
  // Minimize x^2 + 1.
  x, y, err := o.Optimize(func(params map[bayesopt.Param]float64) float64 {
    return math.Pow(params[X], 2) + 1
  })
  if err != nil {
    log.Fatal(err)
  }
  log.Println(x, y)
}

How does it work?

From https://github.com/fmfn/BayesianOptimization:

Bayesian optimization works by constructing a posterior distribution of functions (gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not, as seen in the picture below.

BayesianOptimization in action

As you iterate over and over, the algorithm balances exploration against exploitation, taking into account what it knows about the target function. At each step a Gaussian process is fitted to the known samples (points previously explored), and the posterior distribution, combined with an exploration strategy such as UCB (Upper Confidence Bound) or EI (Expected Improvement), is used to determine the next point that should be explored (see the gif below).

BayesianOptimization in action
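The UCB and EI rules mentioned above have simple closed forms once the Gaussian process posterior supplies a mean and standard deviation at a candidate point. Below is a minimal, self-contained Go sketch of both scoring rules (for maximization); it is illustrative only and is not go-bayesopt's internal implementation.

```go
package main

import (
	"fmt"
	"math"
)

// ucb scores a candidate with the Upper Confidence Bound rule:
// posterior mean plus kappa standard deviations. A larger kappa
// favors exploration; a smaller one favors exploitation.
func ucb(mean, std, kappa float64) float64 {
	return mean + kappa*std
}

// ei computes Expected Improvement over the best observed value,
// using the standard closed form with the normal PDF and CDF.
func ei(mean, std, best float64) float64 {
	if std == 0 {
		return 0 // a fully certain point cannot improve in expectation
	}
	z := (mean - best) / std
	pdf := math.Exp(-z*z/2) / math.Sqrt(2*math.Pi)
	cdf := 0.5 * (1 + math.Erf(z/math.Sqrt2))
	return (mean-best)*cdf + std*pdf
}

func main() {
	// A certain point (std = 0) at the current best has zero EI,
	// while an uncertain point with the same mean still scores > 0.
	fmt.Println(ei(1.0, 0, 1.0))
	fmt.Println(ei(1.0, 0.5, 1.0))
	fmt.Println(ucb(1.0, 0.5, 2.0))
}
```

This is why the optimizer keeps probing uncertain regions: uncertainty alone gives a point a positive Expected Improvement, even when its predicted mean is no better than the current best.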

This process is designed to minimize the number of steps required to find a combination of parameters that is close to the optimal one. To do so, the method uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is computationally cheap, so common tools can be employed. Bayesian optimization is therefore best suited to situations where sampling the function to be optimized is very expensive. See the references for a proper discussion of this method.

License

go-bayesopt is licensed under the MIT license.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].