
jakob-r / mlrHyperopt

License: BSD-3-Clause
Easy Hyperparameter Optimization with mlr and mlrMBO.

Programming Languages

HTML, PostScript, R, TeX

Projects that are alternatives to or similar to mlrHyperopt

mlr
Machine Learning in R
Stars: ✭ 1,542 (+5040%)
Mutual labels:  r-package, mlr, learners
mlrMBO
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+476.67%)
Mutual labels:  optimization, hyperparameter-optimization, r-package
RL Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (+436.67%)
Mutual labels:  optimization, hyperparameter-optimization
scikit-optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+7426.67%)
Mutual labels:  optimization, hyperparameter-optimization
Bayesian Optimization
Python code for Bayesian optimization using Gaussian processes
Stars: ✭ 245 (+716.67%)
Mutual labels:  optimization, hyperparameter-optimization
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (+273.33%)
Mutual labels:  optimization, hyperparameter-optimization
FAR-HO
Gradient based hyperparameter optimization & meta-learning package for TensorFlow
Stars: ✭ 161 (+436.67%)
Mutual labels:  optimization, hyperparameter-optimization
Cornell-MOE
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+560%)
Mutual labels:  optimization, hyperparameter-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+1620%)
Mutual labels:  optimization, hyperparameter-optimization
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+703.33%)
Mutual labels:  hyperparameter-optimization, tuning-parameters
mlr3tuning
Hyperparameter optimization package of the mlr3 ecosystem
Stars: ✭ 44 (+46.67%)
Mutual labels:  hyperparameter-optimization, r-package
keras-hypetune
A friendly Python package for Keras hyperparameter tuning based only on NumPy and hyperopt.
Stars: ✭ 47 (+56.67%)
Mutual labels:  hyperparameter-optimization, tuning-parameters
RL Baselines Zoo
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Stars: ✭ 839 (+2696.67%)
Mutual labels:  optimization, hyperparameter-optimization
Gradient-Free-Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+2270%)
Mutual labels:  optimization, hyperparameter-optimization
HyperparameterHunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+2060%)
Mutual labels:  optimization, hyperparameter-optimization
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+506.67%)
Mutual labels:  optimization, hyperparameter-optimization
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (+210%)
Mutual labels:  optimization, hyperparameter-optimization
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+1400%)
Mutual labels:  optimization, hyperparameter-optimization
Ray
An open source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+61723.33%)
Mutual labels:  optimization, hyperparameter-optimization
osprey
🦅Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (+136.67%)
Mutual labels:  optimization, hyperparameter-optimization

Superseded...

⚠️ Please check out the successor: mlr3tuningspaces

mlrHyperopt


Easy Hyperparameter Optimization with mlr and mlrMBO.

Installation

devtools::install_github("berndbischl/ParamHelpers") # version >= 1.11 needed.
devtools::install_github("jakob-r/mlrHyperopt", dependencies = TRUE)

Purpose

mlrHyperopt aims to make hyperparameter optimization of machine learning methods super simple. It offers tuning in one line:

library(mlrHyperopt)
# tune an SVM on the iris task using the packaged default search space
res = hyperopt(iris.task, learner = "classif.svm")
res
## Tune result:
## Op. pars: cost=12.6; gamma=0.0159
## mmce.test.mean=0.02

It mainly uses the learners implemented in mlr together with the tuning methods that mlr already provides. Unfortunately, mlr lacks well-defined search spaces for its learners, which are needed to make hyperparameter tuning easy.

mlrHyperopt includes default search spaces for the most common machine learning methods, such as random forests, SVMs, and boosting.

As no single developer can be an expert on every machine learning method available for R and mlr, mlrHyperopt also offers a web service to share, upload, and download improved search spaces.
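
For illustration, a shared or default search space can also be handled explicitly. The functions getDefaultParConfig() and downloadParConfigs() below are taken from the package's documented API; still, treat this as a sketch rather than a definitive reference:

library(mlrHyperopt)

# Inspect the default search space shipped for an mlr learner.
par.config = getDefaultParConfig("classif.svm")
print(par.config)

# Download search spaces that other users shared for this learner via the
# web service, then tune with an explicitly chosen ParConfig.
par.configs = downloadParConfigs(learner.name = "classif.svm")
res = hyperopt(iris.task, par.config = par.configs[[1]])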

Development Status

Web Server

ParConfigs are uploaded and downloaded as JSON (sketched below) and stored in a database on the server. The server is a very basic Ruby on Rails CRUD app generated via scaffolding, with tiny modifications: https://github.com/jakob-r/mlrHyperoptServer.

ToDo:
* Voting system.
* Upload/download counts.
* Improve the API.
* Return the existing ID when a duplicate is uploaded (instead of an error).
* Allow a combined search (instead of one key-value pair).
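
Purely for illustration, serializing a search space to JSON for such an exchange could look like the sketch below; the field names are invented here and are not the server's actual schema:

library(jsonlite)

# Hypothetical payload: a search space for classif.svm described as a list,
# then serialized to JSON as the server might store it.
payload = list(
  learner.name = "classif.svm",
  par.set = list(
    cost  = list(type = "numeric", lower = -15, upper = 15, trafo = "2^x"),
    gamma = list(type = "numeric", lower = -15, upper = 15, trafo = "2^x")
  )
)
toJSON(payload, auto_unbox = TRUE, pretty = TRUE)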

R package

Basic functionality works reliably. Maybe I will improve the optimization heuristics in the future. It still needs more default search spaces for popular learners!

Reproducibility

This package is still under construction and the inner workings might change without a version-number update. Thus I do not recommend using it for reproducible research until it is on CRAN. For reproducible research you might want to stick to the lengthier but more precise mlr tuning workflow. You can still use the parameter sets recommended by mlrHyperopt; just make sure to write them down explicitly in your source code, as sketched below.
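
A minimal sketch of that explicit mlr workflow, with the search space written down in the source code (the SVM bounds below are illustrative, not necessarily the ParConfig that mlrHyperopt would recommend):

library(mlr)

# Write the parameter set down explicitly so the analysis stays
# reproducible even if mlrHyperopt's recommended defaults change.
par.set = makeParamSet(
  makeNumericParam("cost",  lower = -15, upper = 15, trafo = function(x) 2^x),
  makeNumericParam("gamma", lower = -15, upper = 15, trafo = function(x) 2^x)
)

ctrl  = makeTuneControlRandom(maxit = 20L)
rdesc = makeResampleDesc("CV", iters = 3L)

set.seed(1)  # fix the RNG so the tuning run is repeatable
res = tuneParams("classif.svm", task = iris.task, resampling = rdesc,
                 par.set = par.set, control = ctrl)
res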

Collaboration

Is encouraged! 👍
