Bayesian Adaptive Direct Search (BADS) - v1.0.6

News:

  • If you are interested in Bayesian model fitting, check out Variational Bayesian Monte Carlo (VBMC), a simple and user-friendly toolbox for Bayesian posterior and model inference that we published at NeurIPS (2018, 2020).
  • The BADS paper [1] has been accepted for a poster presentation at NeurIPS 2017! (20.9% acceptance rate this year, for a total of 3240 submissions)
  • BADS has also been presented at the NeurIPS workshop on Bayesian optimization for science and engineering, BayesOpt 2017.

What is it

BADS is a fast Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those related to fitting computational models (e.g., via maximum likelihood estimation).

BADS has been intensively tested for fitting behavioral, cognitive, and neural models, and is currently being used in many computational labs around the world. In our benchmark with real model-fitting problems, BADS performed on par with or better than many other common and state-of-the-art MATLAB optimizers, such as fminsearch, fmincon, and cmaes [1].

BADS is recommended when no gradient information is available, and the objective function is non-analytical or noisy, for example evaluated through numerical approximation or via simulation.

BADS requires no specific tuning and runs off-the-shelf like other built-in MATLAB optimizers such as fminsearch.

If you are interested in estimating posterior distributions (i.e., uncertainty and error bars) over parameters, and not just point estimates, you might want to check out Variational Bayesian Monte Carlo, a toolbox for Bayesian posterior and model inference which can be used in synergy with BADS.

Installation

Download the latest version of BADS as a ZIP file.

  • To install BADS, clone or unpack the zipped repository where you want it and run the script install.m.
    • This will add the BADS base folder to the MATLAB search path.
  • To see if everything works, run bads('test').
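In a MATLAB session, the steps above look like this (the path below is a placeholder for wherever you unpacked BADS):

```matlab
cd('path/to/bads');   % replace with the folder where you unpacked BADS
install;              % adds the BADS base folder to the MATLAB search path
bads('test');         % runs a toy optimization to check that everything works
```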

Quick start

The BADS interface is similar to that of other MATLAB optimizers. The basic usage is:

[X,FVAL] = bads(FUN,X0,LB,UB,PLB,PUB);

with input parameters:

  • FUN, a function handle to the objective function to minimize (typically, the negative log likelihood of a dataset and model, for a given input parameter vector);
  • X0, the starting point of the optimization (a row vector);
  • LB and UB, hard lower and upper bounds;
  • PLB and PUB, plausible lower and upper bounds, that is, a box where you would expect to find almost all solutions.

The output parameters are:

  • X, the found optimum.
  • FVAL, the (estimated) function value at the optimum.
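
For example, a minimal run on a standard test function (the Rosenbrock function here is just an illustrative choice of objective):

```matlab
% Minimize the 2-D Rosenbrock function (illustrative objective)
fun = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;

x0  = [0 0];      % starting point
lb  = [-5 -5];    % hard lower bounds
ub  = [5 5];      % hard upper bounds
plb = [-2 -2];    % plausible lower bounds
pub = [2 2];      % plausible upper bounds

[x, fval] = bads(fun, x0, lb, ub, plb, pub);
% x should end up close to the global minimum [1 1], where fval is about 0
```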

For more usage examples, see bads_examples.m. You can also type help bads to display the documentation.

For practical recommendations, such as how to set LB and UB, and any other question, check out the FAQ on the BADS wiki.

Note: BADS is a semi-local optimization algorithm, in that it can escape local minima better than many other methods — but it can still get stuck. The best performance for BADS is obtained by running the algorithm multiple times from distinct starting points (see here).
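
A minimal multi-start pattern looks like this (a sketch, not a built-in BADS feature; fun, lb, ub, plb, and pub are assumed to be defined as in the Quick start section):

```matlab
% Run BADS from several random starting points within the plausible box
nruns = 5;
bestfval = Inf;
for i = 1:nruns
    x0 = plb + rand(size(plb)) .* (pub - plb);   % random start in [PLB, PUB]
    [x, fval] = bads(fun, x0, lb, ub, plb, pub);
    if fval < bestfval                           % keep the best solution found
        bestx = x;
        bestfval = fval;
    end
end
```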

How does it work

BADS follows a mesh adaptive direct search (MADS) procedure for function minimization that alternates poll steps and search steps (see Fig 1).

  • In the poll stage, points are evaluated on a mesh by taking steps in one direction at a time, until an improvement is found or all directions have been tried. The step size is doubled in case of success, halved otherwise.
  • In the search stage, a Gaussian process (GP) is fit to a (local) subset of the points evaluated so far. Then, we iteratively choose points to evaluate according to a lower confidence bound strategy that trades off between exploration of uncertain regions (high GP uncertainty) and exploitation of promising solutions (low GP mean).
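
For intuition, the poll stage can be sketched as below. This is a deliberately simplified illustration, not the actual BADS code, which uses richer poll direction sets and interacts with the GP surrogate:

```matlab
% Heavily simplified poll stage (illustration only)
function [x, fx, mesh] = poll_step(fun, x, fx, mesh)
D = numel(x);
success = false;
dirs = [eye(D); -eye(D)];           % +/- coordinate directions (one per row)
for k = 1:size(dirs, 1)
    xnew = x + mesh * dirs(k, :);   % step of size MESH along one direction
    fnew = fun(xnew);
    if fnew < fx                    % opportunistic poll: accept first improvement
        x = xnew; fx = fnew; success = true;
        break;
    end
end
if success
    mesh = 2 * mesh;                % expand the mesh after a successful poll
else
    mesh = mesh / 2;                % contract it after an unsuccessful poll
end
end
```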

Fig 1: BADS procedure BADS procedure

See here for a visualization of several optimizers at work, including BADS.

See our paper for more details [1].

Troubleshooting

If you have trouble doing something with BADS, check out the FAQ on the BADS wiki.

This project is under active development. If you find a bug, or anything that needs correction, please let me know.

BADS for other programming languages

BADS is currently available only for MATLAB. A Python version is being planned.

If you are interested in porting BADS to Python or another language (e.g., R), contact me at [email protected] (putting 'BADS' in the subject of the email); I'd be willing to help. However, before contacting me about this, please have a good look at the codebase here on GitHub and at the paper [1]. BADS is a fairly complex piece of software, so be aware that porting it will require considerable effort and programming skill.

Reference

  1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In Advances in Neural Information Processing Systems 30, pages 1834-1844. (link, arXiv preprint)

You can cite BADS in your work with something along the lines of

We optimized the log likelihoods of our models using Bayesian adaptive direct search (BADS; Acerbi and Ma, 2017). BADS alternates between a series of fast, local Bayesian optimization steps and a systematic, slower exploration of a mesh grid.

Besides formal citations, you can demonstrate your appreciation for BADS in the following ways:

  • Star the BADS repository on GitHub;
  • Follow me on Twitter for updates about BADS and other projects I am involved with;
  • Tell me about your model-fitting problem and your experience with BADS (positive or negative) at [email protected] (putting 'BADS' in the subject of the email).

License

BADS is released under the terms of the GNU General Public License v3.0.
