
aspuru-guzik-group / phoenics

Licence: Apache-2.0 License
Phoenics: Bayesian optimization for efficient experiment planning


Phoenics

Phoenics is an open source optimization algorithm combining ideas from Bayesian optimization with Bayesian Kernel Density estimation [1]. It performs global optimization on expensive to evaluate objectives, such as physical experiments or demanding computations. Phoenics supports sequential and batch optimizations and allows for the simultaneous optimization of multiple objectives via the Chimera scalarizing function [2].
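To give a flavour of how a kernel density estimate can act as a cheap surrogate for an expensive objective, the sketch below fits a Gaussian-kernel (Nadaraya-Watson) regressor to a handful of observations. This is an illustration of the general idea only, not the actual Phoenics model; the function name and arguments are hypothetical.

```python
import numpy as np

def kde_surrogate(x, obs_params, obs_objs, bandwidth=0.1):
    """Toy Gaussian-kernel (Nadaraya-Watson) surrogate: predictions are
    kernel-weighted averages of the observed objective values.
    Illustrative only -- NOT the actual Phoenics model."""
    weights = np.exp(-0.5 * ((x[:, None] - obs_params[None, :]) / bandwidth) ** 2)
    return (weights @ obs_objs) / (weights.sum(axis=1) + 1e-12)

# two (hypothetical) observations of an expensive objective
obs_params = np.array([0.0, 1.0])
obs_objs   = np.array([0.2, 0.8])

# the surrogate interpolates smoothly between the observed points
grid        = np.linspace(0.0, 1.0, 5)
predictions = kde_surrogate(grid, obs_params, obs_objs)
```

A cheap surrogate like this can be queried densely to decide where to evaluate the expensive objective next, which is the core trade that Bayesian optimization makes.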

Check out the examples folder for detailed descriptions and code examples for:

  • Sequential optimization: examples/optimization_sequential
  • Parallelizable batch optimization: examples/optimization_parallel
  • Periodic parameter support: examples/optimization_periodic_parameters
  • Multi-objective optimization: examples/optimization_multiple_objectives

More elaborate applications of Phoenics and Chimera are listed below:

  • Auto-calibration of a virtual robot: examples/application_robot_calibration

Chimera

Chimera is a general-purpose achievement scalarizing function for multi-objective optimization. Users express their preferences as an importance hierarchy over the objectives, together with a relative tolerance on each objective indicating how much degradation is acceptable. Chimera is integrated into Phoenics, but is also available for download as a wrapper for other optimization methods (see chimera).

Installation

You can install Phoenics via pip:

apt-get install python3-pip
pip install phoenics

or by creating a conda environment from the provided environment file:

conda env create -f environment.yml
source activate phoenics

Alternatively, you can build Phoenics from source by cloning this repository:

git clone https://github.com/aspuru-guzik-group/phoenics.git

Requirements

This code has been tested with Python 3.6 and uses

  • cython 0.27.3
  • json 2.0.9
  • numpy 1.13.1
  • scipy 0.19.1

Phoenics can construct its probabilistic model with two different probabilistic modeling libraries: PyMC3 and Edward. Depending on your preferences, you will either need

  • pymc3 3.2
  • theano 1.0.1

or

  • edward 1.3.5
  • tensorflow 1.4.1

Check out the environment.yml file for more details.

Using Phoenics

Phoenics is designed to suggest new parameter points based on prior observations. The suggested parameters can then be passed on to the objective evaluations (experiments or demanding computations). As soon as the objective values have been determined for a set of parameters, these new observations can be passed back to Phoenics to request new, more informative parameters.

from phoenics import Phoenics
    
# create an instance from a configuration file
config_file = 'config.json'
phoenics    = Phoenics(config_file)
    
# request new parameters from a set of observations
params      = phoenics.choose(observations = observations)

Detailed examples for specific applications are presented in the examples folder.
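The suggest → evaluate → feed-back loop described above can be sketched end-to-end. To keep the example self-contained, the planner below is a trivial random-search stand-in that exposes the same choose(observations=...) interface; in a real campaign the Phoenics instance takes its place. The 'objective' key and the run_campaign helper are hypothetical names for illustration.

```python
import random

class RandomPlanner:
    # Stand-in exposing the same choose(observations=...) interface;
    # in a real campaign the Phoenics instance takes its place.
    def choose(self, observations=None):
        return [{'x': random.uniform(0.0, 1.0)}]

def run_campaign(planner, evaluate, num_iter=20):
    # the feedback loop described above: suggest -> evaluate -> report back
    observations = []
    for _ in range(num_iter):
        for params in planner.choose(observations=observations):
            params['objective'] = evaluate(params)   # hypothetical key name
            observations.append(params)
    return min(observations, key=lambda obs: obs['objective'])

random.seed(42)
best = run_campaign(RandomPlanner(), lambda p: (p['x'] - 0.3) ** 2)
```

The loop structure stays the same whether the evaluation is a physical experiment or a simulation; only the evaluate callable changes.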

Using Chimera

Chimera is integrated into Phoenics, but is also available as a stand-alone wrapper for other single-objective optimization algorithms. The Chimera wrapper casts a set of objective values for a number of observations into a single objective value per observation, enabling single-objective optimization algorithms to tackle the multi-objective optimization problem. The usage of Chimera is outlined below in an example with four objectives:

from chimera import Chimera

# define tolerances in descending order of importance
tolerances = [0.25, 0.1, 0.25, 0.05]

# create Chimera instance
chimera = Chimera(tolerances)

# cast objectives of shape      [num_observations, num_objectives]
# into single objective vector  [num_observations, 1]
single_objectives = chimera.scalarize_objectives(objectives)

Note: Phoenics automatically employs Chimera when the configuration contains more than one objective.
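To build intuition for what an importance hierarchy with tolerances does, the simplified sketch below normalises each objective, penalises only the degradation that exceeds its tolerance, and weights higher-priority objectives so that they dominate lower-priority ones. This is an illustrative stand-in only, not Chimera's actual formula; the function name is hypothetical.

```python
import numpy as np

def toy_hierarchy_scalarize(objectives, tolerances):
    """Simplified achievement scalarization: each objective may degrade by up
    to its relative tolerance before it incurs a penalty, and penalties on
    higher-priority objectives are weighted to dominate lower-priority ones.
    Illustrative stand-in only -- NOT Chimera's actual formula."""
    objectives = np.asarray(objectives, dtype=float)
    num_objs   = objectives.shape[1]
    merit      = np.zeros(objectives.shape[0])
    for rank, tol in enumerate(tolerances):
        col  = objectives[:, rank]
        span = col.max() - col.min()
        norm = (col - col.min()) / (span if span > 0 else 1.0)
        merit += np.maximum(norm - tol, 0.0) * 10.0 ** (num_objs - rank)
    return merit

# the last observation stays within the first objective's tolerance
# and is therefore preferred overall (lower merit is better)
objectives = np.array([[0.0, 1.0],
                       [1.0, 0.0],
                       [0.2, 0.5]])
merits = toy_hierarchy_scalarize(objectives, tolerances=[0.5, 0.1])
```

Note how the second observation is best on the second objective yet ranks worst overall, because it violates the tolerance on the more important first objective.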

Disclaimer

Note: This repository is under construction! We hope to add further details on the method, instructions and more examples in the near future.

Experiencing problems?

Please create a new issue and describe your problem in detail so we can fix it.

References

[1] Häse, F., Roch, L. M., Kreisbeck, C., & Aspuru-Guzik, A. Phoenics: A Bayesian Optimizer for Chemistry. ACS Central Science, 4(6), 1134–1145 (2018).

[2] Häse, F., Roch, L. M., & Aspuru-Guzik, A. Chimera: Enabling Hierarchy Based Multi-Objective Optimization for Self-Driving Laboratories. Chemical Science (2018).
