alonfnt / bayex

License: MIT
Bayesian Optimization in JAX

Programming Languages

python

Projects that are alternatives of or similar to bayex

ADAM
ADAM implements a collection of algorithms for calculating rigid-body dynamics in Jax, CasADi, PyTorch, and Numpy.
Stars: ✭ 51 (+112.5%)
Mutual labels:  automatic-differentiation, jax
omd
JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (+79.17%)
Mutual labels:  automatic-differentiation, jax
ultraopt
Distributed Asynchronous Hyperparameter Optimization, better than HyperOpt.
Stars: ✭ 93 (+287.5%)
Mutual labels:  bayesian-optimization
xgboost-lightgbm-hyperparameter-tuning
Bayesian Optimization and Grid Search for xgboost/lightgbm
Stars: ✭ 40 (+66.67%)
Mutual labels:  bayesian-optimization
autodiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.
Stars: ✭ 69 (+187.5%)
Mutual labels:  automatic-differentiation
FwiFlow.jl
Elastic Full Waveform Inversion for Subsurface Flow Problems Using Intrusive Automatic Differentiation
Stars: ✭ 24 (+0%)
Mutual labels:  automatic-differentiation
cr-sparse
Functional models and algorithms for sparse signal processing
Stars: ✭ 38 (+58.33%)
Mutual labels:  jax
AbstractOperators.jl
Abstract operators for large scale optimization in Julia
Stars: ✭ 26 (+8.33%)
Mutual labels:  automatic-differentiation
CamelsOptimizer
Yes, it's a camel case.
Stars: ✭ 17 (-29.17%)
Mutual labels:  bayesian-optimization
TensorAlgDiff
Automatic Differentiation for Tensor Algebras
Stars: ✭ 26 (+8.33%)
Mutual labels:  automatic-differentiation
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (+500%)
Mutual labels:  bayesian-optimization
koclip
KoCLIP: Korean port of OpenAI CLIP, in Flax
Stars: ✭ 80 (+233.33%)
Mutual labels:  jax
dopt
A numerical optimisation and deep learning framework for D.
Stars: ✭ 28 (+16.67%)
Mutual labels:  automatic-differentiation
jaxdf
A JAX-based research framework for writing differentiable numerical simulators with arbitrary discretizations
Stars: ✭ 50 (+108.33%)
Mutual labels:  jax
FLEXS
Fitness landscape exploration sandbox for biological sequence design.
Stars: ✭ 92 (+283.33%)
Mutual labels:  bayesian-optimization
ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
Stars: ✭ 37 (+54.17%)
Mutual labels:  jax
fedpa
Federated posterior averaging implemented in JAX
Stars: ✭ 38 (+58.33%)
Mutual labels:  jax
madam
👩 Pytorch and Jax code for the Madam optimiser.
Stars: ✭ 46 (+91.67%)
Mutual labels:  jax
syne-tune
Large scale and asynchronous Hyperparameter Optimization at your fingertip.
Stars: ✭ 105 (+337.5%)
Mutual labels:  bayesian-optimization
HamiltonianSolver
Numerically solves equations of motion for a given Hamiltonian function
Stars: ✭ 51 (+112.5%)
Mutual labels:  automatic-differentiation

BAYEX: Bayesian Optimization powered by JAX


Features | Installation | Usage | Contributing

Bayex is a high-performance Bayesian global optimization library using Gaussian processes. In contrast to existing Bayesian optimization libraries, Bayex is written entirely in JAX.

Bayesian Optimization (BO) methods are useful for optimizing functions that are expensive to evaluate, lack an analytical expression, and whose evaluations can be contaminated by noise. These methods typically rely on a Gaussian process (GP), upon which an acquisition function measures the expected utility of evaluating the objective at a new suggested point and thereby guides the optimization process.
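As a rough illustration of what an acquisition function computes (this is not Bayex's internal API; all names here are hypothetical), the Expected Improvement for maximization can be derived from the GP's posterior mean and standard deviation at a candidate point:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, best, xi=0.01):
    """Expected improvement over `best` (maximization), given the GP's
    posterior mean `mu` and standard deviation `sigma` at a candidate point.
    `xi` trades off exploration against exploitation."""
    if sigma <= 0.0:
        # No predictive uncertainty: improvement is deterministic.
        return max(mu - best - xi, 0.0)
    imp = mu - best - xi
    z = imp / sigma
    return imp * norm_cdf(z) + sigma * norm_pdf(z)
```

In a BO loop, the next point to evaluate is the candidate that maximizes this quantity over the search space.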

Features

  • High Performance: by making use of vectorization and JIT compilation provided by JAX.
  • Hardware Accelerated: Bayex runs on CPU, GPU, and TPU without issues.
  • Discrete variables: Support for discrete variables.
  • Multiple Acquisition Functions: Expected Improvement, Probability of Improvement, Upper/Lower Confidence Bound, etc.
  • Multiple Kernel choices: Squared Exponential, Matérn (0.5, 1.0, 1.5), Periodic, etc.
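For intuition, here is a minimal, self-contained sketch of the squared-exponential kernel named above (the function name and signature are illustrative, not Bayex's API):

```python
import math

def squared_exponential(x1, x2, lengthscale=1.0, variance=1.0):
    """k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    sqdist = sum((a - b) ** 2 for a, b in zip(x1, x2))
    return variance * math.exp(-sqdist / (2.0 * lengthscale ** 2))

# The kernel equals `variance` at zero distance and decays smoothly
# as the two points move apart, encoding the GP's similarity prior.
```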

Installation

Bayex can be installed using PyPI via pip:

pip install bayex

or from GitHub directly

pip install git+https://github.com/alonfnt/bayex.git

For more advanced instructions, please refer to the installation guide.

Usage

Using Bayex is very straightforward:

import bayex

# Objective function to maximise.
def f(x, y):
    return -y ** 2 - (x - y) ** 2 + 3 * x / y - 2

# Bounds for each input variable.
constrains = {'x': (-10, 10), 'y': (0, 10)}
optim_params = bayex.optim(f, constrains=constrains, seed=42, n=10)

showing the results can be done with

>>> bayex.show_results(optim_params)
   #sample      target          x            y
      1        -9.84385      2.87875      3.22516
      2        -307.513     -6.13013      8.86493
      3        -19.2197      6.8417       1.9193
      4        -43.6495     -3.09738      2.52383
      5        -58.9488      2.63803      6.54768
      6        -64.8658      4.5109       7.47569
      7        -78.5649      6.91026      8.70257
      8        -9.49354      5.56705      1.43459
      9        -9.59955      5.60318      1.39322
     10        -15.4077      6.37659      1.5895
     11        -11.7703      5.83045      1.80338
     12        -11.4169      2.53303      3.32719
     13        -8.49429      2.67945      3.0094
     14        -9.17395      2.74325      3.11174
     15        -7.35265      2.86541      2.88627

we can also obtain the maximum value found using

>>> optim_params.target
-7.352654457092285

as well as the input parameters that yield it

>>> optim_params.params
{'x': 2.865405, 'y': 2.8862667}
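As a sanity check, plugging the returned parameters back into the objective should reproduce the reported target value (up to floating-point precision):

```python
# Same objective as in the usage example above.
def f(x, y):
    return -y ** 2 - (x - y) ** 2 + 3 * x / y - 2

# Evaluate at the parameters Bayex reported as optimal.
value = f(2.865405, 2.8862667)  # close to optim_params.target
```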

Contributing

Everyone can contribute to Bayex and we welcome pull requests as well as raised issues. Please refer to this contribution guide on how to do it.

References

  1. A Tutorial on Bayesian Optimization
  2. BayesianOptimization Library
  3. JAX: Autograd and XLA