
AIworx-Labs / Chocolate

License: BSD-3-Clause
A fully decentralized hyperparameter optimization framework

Programming Languages

Python
139,335 projects; #7 most used programming language

Projects that are alternatives to or similar to Chocolate

Cornell Moe
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+76.79%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+301.79%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (+28.57%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+1916.07%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization
Gradient Free Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+534.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+54.46%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (-16.96%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+62.5%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)
Stars: ✭ 516 (+360.71%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
CamelsOptimizer
Yes, it's a camel case.
Stars: ✭ 17 (-84.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mlrHyperopt
Easy Hyper Parameter Optimization with mlr and mlrMBO.
Stars: ✭ 30 (-73.21%)
Mutual labels:  optimization, hyperparameter-optimization
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (+158.04%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
hyper-engine
Python library for Bayesian hyper-parameter optimization
Stars: ✭ 80 (-28.57%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-69.64%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
syne-tune
Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-6.25%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
osprey
🦅Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (-36.61%)
Mutual labels:  optimization, hyperparameter-optimization
Hpbandster
A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+307.14%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Bayesianoptimization
A Python implementation of global optimization with Gaussian processes.
Stars: ✭ 5,611 (+4909.82%)
Mutual labels:  bayesian-optimization, optimization
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+5182.14%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+478.57%)
Mutual labels:  hyperparameter-optimization, optimization

Chocolate

Chocolate is a completely asynchronous optimization framework that relies solely on a database to share information between workers. Chocolate uses no master process to distribute tasks; every task is fully independent and gets its information only from the database. Chocolate is thus ideal in controlled computing environments where it is hard to maintain a master process for the duration of the optimization.

Chocolate has been designed and optimized for hyperparameter optimization, where each function evaluation takes a very long time to complete and is difficult to parallelize. Chocolate can optimize over conditional search spaces, either by using conditional kernels in a Bayesian optimizer or by treating the problem as a multi-armed bandit solved with Thompson sampling. Chocolate also handles multi-objective optimization, where multiple loss functions are optimized simultaneously.
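
For instance, a multi-objective problem looks almost identical to a single-objective one: the objective returns a sequence of losses and a multi-objective sampler such as MOCMAES (listed below) is used. The following is a minimal sketch, not taken verbatim from the Chocolate documentation; the mu value and the sequence-of-losses form of update are assumptions to verify against the MOCMAES reference.

import chocolate as choco

def objective(x, y):
    # Two competing losses: distance to the origin and distance to (2, 2).
    loss_1 = x ** 2 + y ** 2
    loss_2 = (x - 2) ** 2 + (y - 2) ** 2
    return loss_1, loss_2

space = {"x": choco.uniform(low=-5, high=5),
         "y": choco.uniform(low=-5, high=5)}

conn = choco.SQLiteConnection("sqlite:///mo_db.db")

# mu (number of parents) is an illustrative value; check the MOCMAES
# documentation for the exact constructor signature.
sampler = choco.MOCMAES(conn, space, mu=5)

token, params = sampler.next()

# For multi-objective optimization the losses are reported together,
# here as a tuple (assumed; see the documentation).
losses = objective(**params)
sampler.update(token, losses)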

Chocolate provides the following sampling/searching algorithms:

  • Grid
  • Random
  • QuasiRandom
  • CMAES
  • MOCMAES
  • Bayesian

and three useful backends (combined with the samplers as sketched below):

  • SQLite
  • MongoDB
  • pandas DataFrame
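
The samplers and backends are meant to be interchangeable: any backend can serve any sampler through the same next()/update() interface. Here is a hedged sketch of the combinations, where the MongoDB URI and the DataFrameConnection arguments are illustrative assumptions rather than documented defaults:

import chocolate as choco

space = {"x": choco.uniform(low=-5, high=5)}

# Pick whichever backend fits your setup (connection arguments assumed).
conn = choco.SQLiteConnection("sqlite:///my_db.db")
# conn = choco.MongoDBConnection("mongodb://localhost:27017/")
# conn = choco.DataFrameConnection()

# Samplers share the same constructor pattern and interface.
sampler = choco.Random(conn, space)
# sampler = choco.QuasiRandom(conn, space)
# sampler = choco.CMAES(conn, space)
# sampler = choco.Bayes(conn, space)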

Chocolate is licensed under the 3-Clause BSD License.

Documentation

The full documentation is available at http://chocolate.readthedocs.io.

Installation

Chocolate is installed using pip; unfortunately, there is no PyPI package yet. Install it directly from GitHub:

pip install git+https://github.com/AIworx-Labs/chocolate@master

Dependencies

Chocolate has various dependencies. The optimizers depend on NumPy, SciPy, and scikit-learn; the SQLite backend depends on dataset and filelock; and the MongoDB backend depends on PyMongo. Some utilities depend on pandas. All dependencies except PyMongo are installed with Chocolate.
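
For example, if you plan to use the MongoDB backend, install PyMongo yourself:

pip install pymongo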

Simple example

The following simple example shows how to optimize a conditional search space with Chocolate. Note that the script samples and evaluates a single point. Since the database connections are safe to use in parallel, you can run this script in many concurrent processes and achieve maximum parallelism.

import chocolate as choco

def objective_function(condition, x=None, y=None):
    """An objective function returning ``1 - x`` when *condition* is 1 and 
    ``y - 6`` when *condition* is 2.
    
    Raises:
        ValueError: If condition is different than 1 or 2.
    """
    if condition == 1:
        return 1 - x
    elif condition == 2:
        return y - 6
    raise ValueError("condition must be 1 or 2, got {}.".format(condition))

# Define the conditional search space 
space = [
    {"condition": 1, "x": choco.uniform(low=1, high=10)},
    {"condition": 2, "y": choco.log(low=-2, high=2, base=10)},
]

# Establish a connection to a SQLite local database
conn = choco.SQLiteConnection("sqlite:///my_db.db")

# Construct the optimizer
sampler = choco.Bayes(conn, space)

# Sample the next point
token, params = sampler.next()

# Calculate the loss for the sampled point (minimized)
loss = objective_function(**params)

# Add the loss to the database
sampler.update(token, loss)
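
To make the parallelism concrete, here is one hedged way to run several independent workers from a single machine, reusing the toy objective and search space from the example above; the worker and iteration counts are illustrative, and the same effect is obtained by simply launching the script above in several terminals. Each worker talks only to the database, with no master process.

import multiprocessing

import chocolate as choco

def objective_function(condition, x=None, y=None):
    # Same toy objective as in the example above.
    return 1 - x if condition == 1 else y - 6

def worker(n_iterations):
    # Each worker opens its own connection; coordination happens
    # entirely through the database.
    conn = choco.SQLiteConnection("sqlite:///my_db.db")
    space = [
        {"condition": 1, "x": choco.uniform(low=1, high=10)},
        {"condition": 2, "y": choco.log(low=-2, high=2, base=10)},
    ]
    sampler = choco.Bayes(conn, space)
    for _ in range(n_iterations):
        token, params = sampler.next()
        loss = objective_function(**params)
        sampler.update(token, loss)

if __name__ == "__main__":
    # 4 workers and 10 iterations each are arbitrary illustrative values.
    processes = [multiprocessing.Process(target=worker, args=(10,))
                 for _ in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()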

Have a look at the documentation tutorials for more examples.
