
mlr-org / mlrMBO

Licence: other
Toolbox for Bayesian Optimization and Model-Based Optimization in R

Programming Languages

R

Projects that are alternatives to or similar to mlrMBO

Gradient Free Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+310.98%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
mlrHyperopt
Easy Hyper Parameter Optimization with mlr and mlrMBO.
Stars: ✭ 30 (-82.66%)
Mutual labels:  optimization, hyperparameter-optimization, r-package
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+160.12%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (-46.24%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+5.2%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+1205.2%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+198.27%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (-35.26%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Cornell Moe
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+14.45%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (-16.76%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+163.58%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Bayesianoptimization
A Python implementation of global optimization with gaussian processes.
Stars: ✭ 5,611 (+3143.35%)
Mutual labels:  bayesian-optimization, optimization
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+226.01%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (+67.05%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-80.35%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+274.57%)
Mutual labels:  hyperparameter-optimization, optimization
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+3319.65%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Rl Baselines Zoo
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Stars: ✭ 839 (+384.97%)
Mutual labels:  hyperparameter-optimization, optimization
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+6083.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
Hypertunity
A toolset for black-box hyperparameter optimisation.
Stars: ✭ 119 (-31.21%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization

mlrMBO

Package website: mlrmbo.mlr-org.com

Model-based optimization with mlr.


Installation

We recommend installing the official release version from CRAN:

install.packages("mlrMBO")

For experimental use, you can install the latest development version from GitHub:

remotes::install_github("mlr-org/mlrMBO")

Introduction

mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.

Features:

  • EGO-type algorithms (Kriging with expected improvement) on purely numerical search spaces, see Jones et al. (1998)
  • Mixed search spaces with numerical, integer, categorical and subordinate parameters
  • Arbitrary parameter transformations, allowing optimization on, e.g., a log scale
  • Optimization of noisy objective functions
  • Multi-criteria optimization with approximated Pareto fronts
  • Parallelization through multi-point batch proposals
  • Parallelization on many parallel back-ends and clusters through batchtools and parallelMap
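As a first taste of the workflow, the following minimal sketch runs a default single-objective MBO loop on a smoof test function. It assumes the default Kriging surrogate (package DiceKriging) is installed; the 10-iteration budget is an arbitrary choice for illustration.

library(mlrMBO)
library(smoof)

# Objective: the 2-d Branin test function shipped with smoof.
obj.fun = makeBraninFunction()

# Default control object; stop after 10 sequential MBO iterations.
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)

# Run MBO; the initial design and the Kriging surrogate are created automatically.
res = mbo(obj.fun, control = ctrl, show.info = FALSE)
res$x  # best configuration found
res$y  # corresponding objective value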

For the surrogate, mlrMBO allows any regression learner from mlr, including:

  • Kriging, a.k.a. Gaussian processes (e.g., via DiceKriging)
  • Random forests (e.g., via randomForest)
  • and many more…
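To swap the surrogate, pass any mlr regression learner to mbo(); the sketch below uses a random forest and is only meant to illustrate the mechanism. Setting predict.type = "se" is required so that uncertainty-based infill criteria can be computed.

library(mlrMBO)
library(mlr)

# Random forest surrogate with standard-error prediction enabled.
surrogate = makeLearner("regr.randomForest", predict.type = "se")

ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)

# obj.fun as in the example above
res = mbo(obj.fun, learner = surrogate, control = ctrl)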

Various infill criteria (a.k.a. acquisition functions) are available:

  • Expected improvement (EI)
  • Upper/lower confidence bound (LCB, a.k.a. statistical lower or upper bound)
  • Augmented expected improvement (AEI)
  • Expected quantile improvement (EQI)
  • API for custom infill criteria
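Infill criteria are configured on the control object; a short sketch:

library(mlrMBO)

ctrl = makeMBOControl()

# Expected improvement
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI())

# Alternatively, a confidence-bound criterion with an explicit lambda:
# ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritCB(cb.lambda = 2))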

Objective functions are created with package smoof, which also offers many test functions for example runs or benchmarks.
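For instance, a custom objective on a mixed search space could be defined as in the following sketch; the function body and parameter names are made up for illustration.

library(smoof)
library(ParamHelpers)

obj.fun = makeSingleObjectiveFunction(
  name = "toy objective",  # illustrative name
  fn = function(x) x$a^2 + ifelse(x$b == "high", 1, 0),
  par.set = makeParamSet(
    makeNumericParam("a", lower = -5, upper = 5),
    makeDiscreteParam("b", values = c("low", "high"))
  ),
  has.simple.signature = FALSE,  # x is passed as a named list
  minimize = TRUE
)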

Parameter spaces and initial designs are created with package ParamHelpers.
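A custom initial design can also be generated with ParamHelpers and passed to mbo() via its design argument. This sketch assumes the lhs package is available for Latin hypercube sampling; the design size of 8 is arbitrary.

library(ParamHelpers)

ps = makeParamSet(
  makeNumericParam("a", lower = -5, upper = 5),
  makeIntegerParam("k", lower = 1L, upper = 10L)
)

# Space-filling initial design; pass to mbo(obj.fun, design = des, ...).
des = generateDesign(n = 8L, par.set = ps, fun = lhs::maximinLHS)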

How to Cite

Please cite our arXiv preprint. You can get citation info via citation("mlrMBO") or copy the following BibTeX entry:

@article{mlrMBO,
  title = {{{mlrMBO}}: {{A Modular Framework}} for {{Model}}-{{Based Optimization}} of {{Expensive Black}}-{{Box Functions}}},
  url = {https://arxiv.org/abs/1703.03373},
  shorttitle = {{{mlrMBO}}},
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1703.03373},
  primaryClass = {stat},
  author = {Bischl, Bernd and Richter, Jakob and Bossek, Jakob and Horn, Daniel and Thomas, Janek and Lang, Michel},
  date = {2017-03-09},
}

Some parts of the package were created as part of other publications. If you use these parts, please cite the relevant work appropriately.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].