
baptistar / Bocs

License: GPL-3.0
Bayesian Optimization of Combinatorial Structures

Programming Languages

MATLAB

Projects that are alternatives to or similar to Bocs

SG MCMC
Implementation of Stochastic Gradient MCMC algorithms
Stars: ✭ 37 (-37.29%)
Mutual labels:  bayesian-optimization
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (+389.83%)
Mutual labels:  bayesian-optimization
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+855.93%)
Mutual labels:  bayesian-optimization
phoenics
Phoenics: Bayesian optimization for efficient experiment planning
Stars: ✭ 68 (+15.25%)
Mutual labels:  bayesian-optimization
Multiobjective EGO algorithms
The standard and parallel multiobjective EGO algorithms
Stars: ✭ 22 (-62.71%)
Mutual labels:  bayesian-optimization
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+662.71%)
Mutual labels:  bayesian-optimization
CamelsOptimizer
Yes, it's a camel case.
Stars: ✭ 17 (-71.19%)
Mutual labels:  bayesian-optimization
Ts Emo
This repository contains the source code for “Thompson sampling efficient multiobjective optimization” (TSEMO).
Stars: ✭ 39 (-33.9%)
Mutual labels:  bayesian-optimization
ytopt
ytopt: machine-learning-based search methods for autotuning
Stars: ✭ 17 (-71.19%)
Mutual labels:  bayesian-optimization
Bayesianoptimization
A Python implementation of global optimization with gaussian processes.
Stars: ✭ 5,611 (+9410.17%)
Mutual labels:  bayesian-optimization
AutoOED
AutoOED: Automated Optimal Experimental Design Platform
Stars: ✭ 87 (+47.46%)
Mutual labels:  bayesian-optimization
sequential-gallery
Sequential Gallery for Interactive Visual Design Optimization [SIGGRAPH 2020]
Stars: ✭ 15 (-74.58%)
Mutual labels:  bayesian-optimization
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+672.88%)
Mutual labels:  bayesian-optimization
open-box
Generalized and Efficient Blackbox Optimization System [SIGKDD'21].
Stars: ✭ 174 (+194.92%)
Mutual labels:  bayesian-optimization
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+9927.12%)
Mutual labels:  bayesian-optimization
bayex
Bayesian Optimization in JAX
Stars: ✭ 24 (-59.32%)
Mutual labels:  bayesian-optimization
Emukit
A Python-based toolbox of various methods in uncertainty quantification and statistical emulation: multi-fidelity, experimental design, Bayesian optimisation, Bayesian quadrature, etc.
Stars: ✭ 316 (+435.59%)
Mutual labels:  bayesian-optimization
Bayeso
Simple, but essential Bayesian optimization package
Stars: ✭ 57 (-3.39%)
Mutual labels:  bayesian-optimization
Gradient Free Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+1105.08%)
Mutual labels:  bayesian-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+774.58%)
Mutual labels:  bayesian-optimization

Bayesian Optimization of Combinatorial Structures (BOCS)

What is the BOCS algorithm?

The BOCS algorithm is a method for finding a global minimizer of an expensive-to-evaluate black-box function defined over discrete inputs, given a finite budget of function evaluations. The algorithm combines an adaptive generative model with semidefinite programming techniques to scale the acquisition function to large combinatorial domains. For more details on the algorithm, please see the ICML paper.
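
In symbols, and as a loose sketch of the setup rather than the paper's exact statement, BOCS addresses problems of the form

    min f(x)   over   x ∈ {0,1}^n,

where f is the expensive black-box objective, n is the number of binary variables, and f may be evaluated at most N times (the evaluation budget, evalBudget below).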

Authors

Ricardo Baptista (MIT) and Matthias Poloczek (University of Arizona)

E-mails: [email protected], [email protected]

Installation

The BOCS algorithm is implemented in MATLAB and requires only the CVX package to be available on the local path for performing convex optimization. The scripts for comparing to other discrete optimization methods additionally require an installation of SMAC, the Python wrapper pySMAC, and the bayesopt function in MATLAB for Bayesian optimization with the Expected Improvement acquisition function.
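
As a minimal sketch of the CVX step (the directory ~/cvx is a hypothetical location; use wherever the CVX package was unpacked):

% One-time CVX setup from within MATLAB
cd ~/cvx        % hypothetical path to the unpacked CVX package
cvx_setup       % CVX's installer; adds CVX to the MATLAB path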

A Python implementation of the BOCS algorithm is also available in the BOCSpy folder. The code has been tested with Python 2.7 and 3.5 and requires the cvxpy and cvxopt packages.
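
Both Python dependencies are on PyPI; assuming a standard pip environment, one way to install them is:

pip install cvxpy cvxopt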

Example running BOCS on a benchmark problem

We provide an example of running the BOCS algorithm on the Ising model sparsification benchmark problem, using a 9-node grid graph and a budget of 100 sample evaluations. The code can also be run from MATLAB using the file scripts/example_ising.m.

The script first defines the input parameters in the inputs struct. These include the number of discrete variables (i.e., edges in the grid graph) n_vars, the sample evaluation budget evalBudget, the number of samples initially used to build the statistical model n_init, the regularization parameter lambda, and the estimator used for the regression model, here set to horseshoe. Other options for estimator are bayes and mle.


inputs = struct;
inputs.n_vars     = 12;           % discrete variables (edges of the 3x3 grid)
inputs.evalBudget = 100;          % total budget of objective evaluations
inputs.n_init     = 20;           % initial samples for the statistical model
inputs.lambda     = 1e-4;         % regularization parameter
inputs.estimator  = 'horseshoe';  % regression estimator ('horseshoe', 'bayes', or 'mle')

We randomly instantiate the edge weights of the Ising model and define the objective function based on the KL divergence in inputs.model. We add an l_1 penalty scaled by the lambda parameter in inputs.penalty.


% Generate random graphical model
Theta   = rand_ising_grid(9);
Moments = ising_model_moments(Theta);

% Save objective function and regularization term
inputs.model    = @(x) KL_divergence_ising(Theta, Moments, x);
inputs.penalty  = @(x) inputs.lambda*sum(x,2);

We sample the initial values for the statistical model and evaluate the objective function at these samples. Using these inputs, we run the BOCS-SA and BOCS-SDP algorithms with the order 2 regression model.


% Generate initial samples for statistical models
inputs.x_vals   = sample_models(inputs.n_init, inputs.n_vars);
inputs.y_vals   = inputs.model(inputs.x_vals);

% Run BOCS-SA and BOCS-SDP (order 2)
B_SA  = BOCS(inputs.model, inputs.penalty, inputs, 2, 'SA');
B_SDP = BOCS(inputs.model, inputs.penalty, inputs, 2, 'sdp');

We also provide an example of running the Python implementation of BOCS on the binary quadratic programming benchmark problem. The code can be run using the command python BOCSpy/example_bqp.py. The script produces a plot showing the convergence of BOCS-SA and BOCS-SDP on an instance of the problem.

Comparison to other discrete optimization algorithms

To compare BOCS to other algorithms, we provide files in the scripts folder that run all cases for the binary quadratic programming problem, the Ising model sparsification problem, and the contamination control problem. The compared methods include SMAC and Bayesian optimization with the Expected Improvement acquisition function (see the Installation section above).

To run these examples, we provide three files in the scripts folder, named opt_runs_quad, opt_runs_ising, and opt_runs_cont, one for each benchmark problem. In each file, the n_func parameter sets the number of random instances of the test function, and the n_runs parameter sets the number of repeated runs of each algorithm, each starting from a different initial dataset. The n_proc parameter specifies the number of available processors used to parallelize these tests. Lastly, running scripts/create_plots plots the average output of the tests and reports statistics of the solutions found by all algorithms, while running scripts/cross_validate followed by scripts/plot_crossval validates the statistical models for each benchmark problem.
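
As a rough illustration of these settings (the values below are hypothetical placeholders, not the defaults shipped with the scripts):

% Hypothetical experiment settings near the top of scripts/opt_runs_quad.m
n_func = 10;   % number of random instances of the test function
n_runs = 5;    % repeated runs per instance, each from a different initial dataset
n_proc = 4;    % processors available to parallelize the runs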
