
mlr-org / mlr3tuning

License: LGPL-3.0
Hyperparameter optimization package of the mlr3 ecosystem

Programming Languages

r
7636 projects

Projects that are alternatives of or similar to mlr3tuning

Hyperopt Keras Cnn Cifar 100
Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could be easily transferred to another dataset or another classification task.
Stars: ✭ 95 (+115.91%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Rl Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (+265.91%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Automl alex
State-of-the-art Automated Machine Learning Python library for Tabular Data
Stars: ✭ 132 (+200%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Rl Baselines Zoo
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Stars: ✭ 839 (+1806.82%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (+350%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Mgo
Purely functional genetic algorithms for multi-objective optimisation
Stars: ✭ 63 (+43.18%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Milano
Milano is a tool for automating hyper-parameters search for your models on a backend of your choice.
Stars: ✭ 140 (+218.18%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hyperband
Tuning hyperparams fast with Hyperband
Stars: ✭ 555 (+1161.36%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (+327.27%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+293.18%)
Mutual labels:  hyperparameter-optimization, r-package
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+1372.73%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+402.27%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+13345.45%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Determined
Determined: Deep Learning Training Platform
Stars: ✭ 1,171 (+2561.36%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+1181.82%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Pbt
Population Based Training (in PyTorch with sqlite3). Status: Unsupported
Stars: ✭ 138 (+213.64%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Neuraxle
A Sklearn-like Framework for Hyperparameter Tuning and AutoML in Deep Learning projects. Finally have the right abstractions and design patterns to properly do AutoML. Let your pipeline steps have hyperparameter spaces. Enable checkpoints to cut duplicate calculations. Go from research to production environment easily.
Stars: ✭ 377 (+756.82%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+1072.73%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (+277.27%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Mlr
Machine Learning in R
Stars: ✭ 1,542 (+3404.55%)
Mutual labels:  tuning, r-package

mlr3tuning

Package website: release | dev


mlr3tuning is the hyperparameter optimization package of the mlr3 ecosystem. It features highly configurable search spaces via the paradox package and finds optimal hyperparameter configurations for any mlr3 learner. mlr3tuning works with several optimization algorithms, e.g. Random Search, Iterated Racing, Bayesian Optimization (in mlr3mbo) and Hyperband (in mlr3hyperband). Moreover, it can automatically optimize learners and estimate the performance of optimized models with nested resampling. The package is built on the optimization framework bbotk.
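
For example, nested resampling is available through the AutoTuner, which wraps a learner so that tuning runs inside train(). A minimal sketch, assuming a recent mlr3tuning release in which auto_tuner() accepts a Tuner object:

library("mlr3verse")

# The AutoTuner tunes cp on the inner folds before fitting the final model.
at = auto_tuner(
  tuner = tnr("random_search"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce"),
  term_evals = 20
)

# Outer resampling: each outer fold runs a complete inner tuning,
# giving an unbiased estimate of the tuned learner's performance.
rr = resample(tsk("sonar"), at, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))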

Extension packages

mlr3tuning is extended by the following packages.

  • mlr3tuningspaces is a collection of search spaces from scientific articles for commonly used learners (see the sketch after this list).
  • mlr3hyperband adds the Hyperband and Successive Halving algorithms.
  • mlr3mbo adds Bayesian Optimization methods.
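
As a sketch of the first extension, a predefined tuning space can be loaded with lts() and attached to the matching learner. The space id below is one of the bundled defaults; available ids depend on the installed mlr3tuningspaces version:

library("mlr3tuningspaces")

# Load a bundled search space and retrieve a learner with it attached.
tuning_space = lts("classif.rpart.default")
learner = tuning_space$get_learner()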

Resources

There are several sections about hyperparameter optimization in the mlr3book.

The gallery features a collection of case studies and demos about optimization.

The cheatsheet summarizes the most important functions of mlr3tuning.

Installation

Install the latest release from CRAN:

install.packages("mlr3tuning")

Install the development version from GitHub:

remotes::install_github("mlr-org/mlr3tuning")

Examples

We optimize the cost and gamma hyperparameters of a support vector machine on the Sonar data set.

library("mlr3verse")

learner = lrn("classif.svm",
  cost  = to_tune(1e-5, 1e5, logscale = TRUE),
  gamma = to_tune(1e-5, 1e5, logscale = TRUE),
  kernel = "radial",
  type = "C-classification"
)
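
The to_tune(1e-5, 1e5, logscale = TRUE) tokens mean the tuner searches the interval [log(1e-5), log(1e5)] and exp() is applied before a value reaches the learner. The implied search space can be inspected directly:

learner$param_set$search_space()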

We construct a tuning instance with the ti() function. The tuning instance describes the tuning problem.

instance = ti(
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("none")
)
instance
## <TuningInstanceSingleCrit>
## * State:  Not optimized
## * Objective: <ObjectiveTuning:classif.svm_on_sonar>
## * Search Space:
##       id    class     lower    upper nlevels
## 1:  cost ParamDbl -11.51293 11.51293     Inf
## 2: gamma ParamDbl -11.51293 11.51293     Inf
## * Terminator: <TerminatorNone>
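
trm("none") is appropriate here because a grid search stops on its own once the grid is exhausted. Tuners without a natural stopping point need a budget terminator instead, for example:

trm("evals", n_evals = 20)   # stop after 20 evaluations
trm("run_time", secs = 300)  # stop after 5 minutes of tuning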

We select a simple grid search as the optimization algorithm.

tuner = tnr("grid_search", resolution = 5)
tuner
## <TunerGridSearch>: Grid Search
## * Parameters: resolution=5, batch_size=1
## * Parameter classes: ParamLgl, ParamInt, ParamDbl, ParamFct
## * Properties: dependencies, single-crit, multi-crit
## * Packages: mlr3tuning

To start the tuning, we simply pass the tuning instance to the tuner.

tuner$optimize(instance)
##        cost     gamma learner_param_vals  x_domain classif.ce
## 1: 11.51293 -5.756463          <list[4]> <list[2]>  0.1779158

The tuner returns the best hyperparameter configuration and the corresponding measured performance. Because the search space was declared with logscale = TRUE, cost and gamma are reported on the log scale; the learner receives exp() of these values.
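
The result can also be read directly from the instance; result_x_domain holds the best configuration on the original, untransformed scale:

instance$result_x_domain
instance$result_y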

The archive contains all evaluated hyperparameter configurations.

as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
##           cost      gamma classif.ce batch_nr      resample_result
##  1:  11.512925  -5.756463  0.1779158        1 <ResampleResult[21]>
##  2:  11.512925  11.512925  0.4662526        2 <ResampleResult[21]>
##  3:   5.756463   5.756463  0.4662526        3 <ResampleResult[21]>
##  4:  -5.756463  -5.756463  0.4662526        4 <ResampleResult[21]>
##  5: -11.512925   0.000000  0.4662526        5 <ResampleResult[21]>
## ---                                                               
## 21: -11.512925  -5.756463  0.4662526       21 <ResampleResult[21]>
## 22:  11.512925   0.000000  0.4662526       22 <ResampleResult[21]>
## 23:   5.756463  11.512925  0.4662526       23 <ResampleResult[21]>
## 24:  11.512925 -11.512925  0.2498965       24 <ResampleResult[21]>
## 25: -11.512925   5.756463  0.4662526       25 <ResampleResult[21]>

The mlr3viz package visualizes tuning results.

library(mlr3viz)

autoplot(instance, type = "surface")

We fit a final model with optimized hyperparameters to make predictions on new data.

learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("sonar"))
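
A short sketch of predicting with the tuned model, using a few held-back rows as a stand-in for genuinely new data:

# predict_newdata() accepts a data.frame of new observations
newdata = tsk("sonar")$data()[1:5, ]
predictions = learner$predict_newdata(newdata)
predictions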