
optuna / Optuna

License: MIT
A hyperparameter optimization framework


Projects that are alternatives of or similar to Optuna

Ray
An open source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+226.59%)
Mutual labels:  parallel, hyperparameter-optimization, distributed
optuna-examples
Examples for https://github.com/optuna/optuna
Stars: ✭ 238 (-95.81%)
Mutual labels:  parallel, distributed, hyperparameter-optimization
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+88.38%)
Mutual labels:  hyperparameter-optimization, distributed
Xyzpy
Efficiently generate and analyse high dimensional data.
Stars: ✭ 45 (-99.21%)
Mutual labels:  parallel, distributed
pooljs
Browser computing unleashed!
Stars: ✭ 17 (-99.7%)
Mutual labels:  parallel, distributed
ParallelUtilities.jl
Fast and easy parallel mapreduce on HPC clusters
Stars: ✭ 28 (-99.51%)
Mutual labels:  parallel, distributed
Lightgbm
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
Stars: ✭ 13,293 (+134.07%)
Mutual labels:  parallel, distributed
Galaxy
Galaxy is an asynchronous parallel visualization ray tracer for performant rendering in distributed computing environments. Galaxy builds upon Intel OSPRay and Intel Embree, including ray queueing and sending logic inspired by TACC GraviT.
Stars: ✭ 18 (-99.68%)
Mutual labels:  parallel, distributed
OpenABL
A domain-specific language for parallel and distributed agent-based simulations.
Stars: ✭ 24 (-99.58%)
Mutual labels:  parallel, distributed
malib
A parallel framework for population-based multi-agent reinforcement learning.
Stars: ✭ 341 (-94%)
Mutual labels:  parallel, distributed
distributed
Library to provide Erlang style distributed computations. This library is inspired by Cloud Haskell.
Stars: ✭ 49 (-99.14%)
Mutual labels:  parallel, distributed
Neuraxle
A Sklearn-like Framework for Hyperparameter Tuning and AutoML in Deep Learning projects. Finally have the right abstractions and design patterns to properly do AutoML. Let your pipeline steps have hyperparameter spaces. Enable checkpoints to cut duplicate calculations. Go from research to production environment easily.
Stars: ✭ 377 (-93.36%)
Mutual labels:  parallel, hyperparameter-optimization
Sia Ui
A Graphical Frontend for Sia - https://sia.tech
Stars: ✭ 394 (-93.06%)
Mutual labels:  distributed
Machma
Easy parallel execution of commands with live feedback
Stars: ✭ 438 (-92.29%)
Mutual labels:  parallel
Libvineyard
libvineyard: an in-memory immutable data manager.
Stars: ✭ 392 (-93.1%)
Mutual labels:  distributed
Nebula
Nebula is a powerful framework for building highly concurrent, distributed, and resilient message-driven applications for C++.
Stars: ✭ 385 (-93.22%)
Mutual labels:  distributed
Libmesh
libMesh github repository
Stars: ✭ 450 (-92.08%)
Mutual labels:  parallel
Pyrlang
Erlang node implemented in Python 3.5+ (Asyncio-based)
Stars: ✭ 436 (-92.32%)
Mutual labels:  distributed
Transmittable Thread Local
📌 TransmittableThreadLocal (TTL), the missing Java™ std lib(simple & 0-dependency) for framework/middleware, provide an enhanced InheritableThreadLocal that transmits values between threads even using thread pooling components.
Stars: ✭ 4,678 (-17.63%)
Mutual labels:  distributed
Npm Run All
A CLI tool to run multiple npm-scripts in parallel or sequentially.
Stars: ✭ 4,496 (-20.83%)
Mutual labels:  parallel

Optuna: A hyperparameter optimization framework


Website | Docs | Install Guide | Tutorial

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to this define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct the search spaces for the hyperparameters.

News

Help us create the next version of Optuna!

The Optuna 3.0 Roadmap has been published for review. Please take a look at the planned improvements to Optuna and share your feedback in the GitHub issues. PR contributions are also welcome!

Please take a few minutes to fill in this survey, and let us know how you use Optuna now and what improvements you'd like.🤔

All questions optional. 🙇‍♂️ https://forms.gle/mCAttqxVg5oUifKV8

Key Features

Optuna offers the following modern functionality:

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., classifier and svm_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate such optimization studies.

Open in Colab

import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.

Examples

Examples can be found in optuna/optuna-examples.

Integrations

Integration modules, which allow pruning (early stopping) of unpromising trials, are available for the following libraries:

Web Dashboard (experimental)

The new web dashboard is under development at optuna-dashboard. It is still experimental, but already much better in many regards. Feature requests and bug reports are welcome!

Manage studies · Visualize with interactive graphs

Install optuna-dashboard via pip:

$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.

Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.6 or newer.

We also provide Optuna Docker images on DockerHub.

Communication

Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple, well-defined, and often good starting points for getting familiar with the contribution workflow and the other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.

Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
