
c-bata / Goptuna

License: MIT
Decentralized hyperparameter optimization framework, inspired by Optuna.

Programming Languages

go
31211 projects - #10 most used programming language

Projects that are alternatives of or similar to Goptuna

Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+205.33%)
Mutual labels:  bayesian-optimization
Modal
A modular active learning framework for Python
Stars: ✭ 1,148 (+579.29%)
Mutual labels:  bayesian-optimization
Hypertunity
A toolset for black-box hyperparameter optimisation.
Stars: ✭ 119 (-29.59%)
Mutual labels:  bayesian-optimization
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+233.73%)
Mutual labels:  bayesian-optimization
Bayeso
Simple, but essential Bayesian optimization package
Stars: ✭ 57 (-66.27%)
Mutual labels:  bayesian-optimization
Nni
An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+6230.18%)
Mutual labels:  bayesian-optimization
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+166.27%)
Mutual labels:  bayesian-optimization
Limbo
A lightweight framework for Gaussian processes and Bayesian optimization of black-box functions (C++-11)
Stars: ✭ 157 (-7.1%)
Mutual labels:  bayesian-optimization
Bocs
Bayesian Optimization of Combinatorial Structures
Stars: ✭ 59 (-65.09%)
Mutual labels:  bayesian-optimization
Bopp
BOPP: Bayesian Optimization for Probabilistic Programs
Stars: ✭ 112 (-33.73%)
Mutual labels:  bayesian-optimization
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+3400.59%)
Mutual labels:  bayesian-optimization
Ts Emo
This repository contains the source code for “Thompson sampling efficient multiobjective optimization” (TSEMO).
Stars: ✭ 39 (-76.92%)
Mutual labels:  bayesian-optimization
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (-37.28%)
Mutual labels:  bayesian-optimization
Bayesianoptimization
A Python implementation of global optimization with gaussian processes.
Stars: ✭ 5,611 (+3220.12%)
Mutual labels:  bayesian-optimization
Nasbot
Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (-28.99%)
Mutual labels:  bayesian-optimization
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+169.82%)
Mutual labels:  bayesian-optimization
Bayesian Machine Learning
Notebooks about Bayesian methods for machine learning
Stars: ✭ 1,202 (+611.24%)
Mutual labels:  bayesian-optimization
Bads
Bayesian Adaptive Direct Search (BADS) optimization algorithm for model fitting in MATLAB
Stars: ✭ 159 (-5.92%)
Mutual labels:  bayesian-optimization
Pysot
Surrogate Optimization Toolbox for Python
Stars: ✭ 136 (-19.53%)
Mutual labels:  bayesian-optimization
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (-33.73%)
Mutual labels:  bayesian-optimization

Goptuna


Decentralized hyperparameter optimization framework, inspired by Optuna [1]. This library is designed primarily for machine learning, but you can optimize anything for which you can define an objective function (e.g., the number of goroutines in your server or the memory buffer size of your caching system).

Supported algorithms:

Goptuna supports various state-of-the-art Bayesian optimization, evolution strategy, and multi-armed bandit algorithms. These algorithms are implemented in pure Go and continuously benchmarked on GitHub Actions.

  • Random search
  • TPE: Tree-structured Parzen Estimators [2]
  • CMA-ES: Covariance Matrix Adaptation Evolution Strategy [3]
  • IPOP-CMA-ES: CMA-ES with increasing population size [4]
  • BIPOP-CMA-ES: BI-population CMA-ES [5]
  • Median Stopping Rule [6]
  • ASHA: Asynchronous Successive Halving Algorithm (Optuna-flavored version) [1,7,8]
  • Quasi-Monte Carlo sampling based on the Sobol sequence [10, 11]
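
Each algorithm is wired in through a study option when the study is created. Below is a minimal sketch of switching to CMA-ES, assuming the cmaes package bundled with Goptuna and the StudyOptionRelativeSampler option; the study name and the toy objective are placeholders:

package main

import (
    "log"
    "math"

    "github.com/c-bata/goptuna"
    "github.com/c-bata/goptuna/cmaes"
)

// A toy two-dimensional objective; any function of the suggested
// parameters works here.
func objective(trial goptuna.Trial) (float64, error) {
    x1, _ := trial.SuggestFloat("x1", -10, 10)
    x2, _ := trial.SuggestFloat("x2", -10, 10)
    return math.Pow(x1-2, 2) + math.Pow(x2+5, 2), nil
}

func main() {
    // CMA-ES is registered as a relative sampler because it samples the
    // whole parameter vector of each trial jointly.
    study, err := goptuna.CreateStudy(
        "goptuna-cmaes-example",
        goptuna.StudyOptionRelativeSampler(cmaes.NewSampler()),
    )
    if err != nil {
        log.Fatal(err)
    }
    if err := study.Optimize(objective, 100); err != nil {
        log.Fatal(err)
    }
}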

Built-in dashboard:

  • Manage optimization results
  • Interactive live-updating graphs
  • Visualization of state-of-the-art algorithms

Projects using Goptuna:

Installation

Because Goptuna is written in pure Go, it is highly portable and easy to integrate into a wide variety of Go projects.

$ go get -u github.com/c-bata/goptuna

Usage

Goptuna provides a Define-by-Run style API like Optuna, which lets you construct search spaces dynamically.

Basic usage

package main

import (
    "log"
    "math"

    "github.com/c-bata/goptuna"
    "github.com/c-bata/goptuna/tpe"
)

// ① Define an objective function which returns a value you want to minimize.
func objective(trial goptuna.Trial) (float64, error) {
    // ② Define the search space via Suggest APIs.
    x1, _ := trial.SuggestFloat("x1", -10, 10)
    x2, _ := trial.SuggestFloat("x2", -10, 10)
    return math.Pow(x1-2, 2) + math.Pow(x2+5, 2), nil
}

func main() {
    // ③ Create a study which manages each experiment.
    study, err := goptuna.CreateStudy(
        "goptuna-example",
        goptuna.StudyOptionSampler(tpe.NewSampler()))
    if err != nil {
        log.Fatal(err)
    }

    // ④ Evaluate your objective function.
    err = study.Optimize(objective, 100)
    if err != nil {
        log.Fatal(err)
    }

    // ⑤ Print the best evaluation parameters.
    v, _ := study.GetBestValue()
    p, _ := study.GetBestParams()
    log.Printf("Best value=%f (x1=%f, x2=%f)",
        v, p["x1"].(float64), p["x2"].(float64))
}

Link: Go Playground
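
Because the search space is constructed while the objective function runs, it can branch on values suggested earlier in the same trial. Here is a minimal sketch of such a conditional search space, assuming Trial's SuggestCategorical and SuggestLogFloat methods; the parameter names ("kernel", "C", "gamma") and the toy score are invented for illustration:

package main

import (
    "log"
    "math"

    "github.com/c-bata/goptuna"
    "github.com/c-bata/goptuna/tpe"
)

func objective(trial goptuna.Trial) (float64, error) {
    // "gamma" only exists in trials where the "rbf" branch is taken.
    kernel, err := trial.SuggestCategorical("kernel", []string{"linear", "rbf"})
    if err != nil {
        return 0, err
    }
    c, err := trial.SuggestLogFloat("C", 1e-3, 1e3)
    if err != nil {
        return 0, err
    }
    score := math.Abs(math.Log10(c)) // toy score standing in for a real model
    if kernel == "rbf" {
        gamma, err := trial.SuggestLogFloat("gamma", 1e-4, 1.0)
        if err != nil {
            return 0, err
        }
        score += math.Abs(math.Log10(gamma) + 2)
    }
    return score, nil
}

func main() {
    study, err := goptuna.CreateStudy(
        "define-by-run-example",
        goptuna.StudyOptionSampler(tpe.NewSampler()))
    if err != nil {
        log.Fatal(err)
    }
    if err := study.Optimize(objective, 50); err != nil {
        log.Fatal(err)
    }
}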

Furthermore, I recommend using the RDB storage backend for the following purposes:

  • Continue from where you stopped in previous optimizations.
  • Scale studies to tens of workers connecting to the same RDB storage.
  • Check optimization results via the built-in dashboard.

Advanced usage

Distributed optimization using MySQL

Using the RDB storage backend requires no complicated setup. First, set up a MySQL server as follows to share optimization results.

$ docker pull mysql:8.0
$ docker run \
  -d \
  --rm \
  -p 3306:3306 \
  -e MYSQL_USER=goptuna \
  -e MYSQL_DATABASE=goptuna \
  -e MYSQL_PASSWORD=password \
  -e MYSQL_ALLOW_EMPTY_PASSWORD=yes \
  --name goptuna-mysql \
  mysql:8.0

Then, create a study object using goptuna CLI.

$ goptuna create-study --storage mysql://goptuna:password@127.0.0.1:3306/yourdb --study yourstudy
yourstudy
$ mysql --host 127.0.0.1 --port 3306 --user goptuna -ppassword -e "SELECT * FROM studies;"
+----------+------------+-----------+
| study_id | study_name | direction |
+----------+------------+-----------+
|        1 | yourstudy  | MINIMIZE  |
+----------+------------+-----------+
1 row in set (0.00 sec)

Finally, run Goptuna workers containing the following code. You can run distributed optimization simply by executing this program on multiple server instances.

package main

import ...

func main() {
    db, _ := gorm.Open(mysql.Open("goptuna:password@tcp(localhost:3306)/yourdb?parseTime=true"), &gorm.Config{
        Logger: logger.Default.LogMode(logger.Silent),
    })
    storage := rdb.NewStorage(db)
    // gorm v2 has no Close() on *gorm.DB; close the underlying *sql.DB instead.
    sqlDB, _ := db.DB()
    defer sqlDB.Close()

    study, _ := goptuna.LoadStudy(
        "yourstudy",
        goptuna.StudyOptionStorage(storage),
        ...,
    )
    _ = study.Optimize(objective, 50)
    ...
}

Full source code is available here.
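
You can also parallelize within a single worker process by running study.Optimize from several goroutines, under the assumption (matching Goptuna's parallel optimization examples) that a study can be shared safely across goroutines. A sketch with the default in-memory storage; with the RDB storage above, the same pattern extends across processes and machines. The worker and trial counts are arbitrary:

package main

import (
    "log"
    "math"
    "sync"

    "github.com/c-bata/goptuna"
    "github.com/c-bata/goptuna/tpe"
)

func objective(trial goptuna.Trial) (float64, error) {
    x, err := trial.SuggestFloat("x", -10, 10)
    if err != nil {
        return 0, err
    }
    return math.Pow(x-2, 2), nil
}

func main() {
    study, err := goptuna.CreateStudy(
        "parallel-example",
        goptuna.StudyOptionSampler(tpe.NewSampler()))
    if err != nil {
        log.Fatal(err)
    }

    // Four goroutines share one study object; each runs 25 trials,
    // for 100 trials in total.
    const workers, trialsPerWorker = 4, 25
    var wg sync.WaitGroup
    for i := 0; i < workers; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            if err := study.Optimize(objective, trialsPerWorker); err != nil {
                log.Print(err)
            }
        }()
    }
    wg.Wait()

    v, _ := study.GetBestValue()
    log.Printf("Best value=%f", v)
}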

Built-in Realtime Web Dashboard

You can check optimization results with the built-in web dashboard.

SQLite3:

$ goptuna dashboard --storage sqlite:///example.db

MySQL:

$ goptuna dashboard --storage mysql://goptuna:password@127.0.0.1:3306/yourdb

(Screenshot: goptuna dashboard)

Shell script to reproduce this (SQLite3 version is here).

Links

References:

Presentations:

Blog posts:

Status:

License

This software is licensed under the MIT license, see LICENSE for more information.
