
automl / Smac3

License: 3-clause BSD
Sequential Model-based Algorithm Configuration


Projects that are alternatives of or similar to Smac3

Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+948.94%)
Mutual labels:  automl, hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning, automated-machine-learning
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-93.97%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning, automl, automated-machine-learning
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (-70.57%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning, automated-machine-learning
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (-19.15%)
Mutual labels:  automl, hyperparameter-optimization, bayesian-optimization, automated-machine-learning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (-64.89%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning, automated-machine-learning
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+1796.81%)
Mutual labels:  automl, hyperparameter-optimization, bayesian-optimization, automated-machine-learning
FEDOT
Automated modeling and machine learning framework FEDOT
Stars: ✭ 312 (-44.68%)
Mutual labels:  hyperparameter-optimization, automl, automated-machine-learning
Tune Sklearn
A drop-in replacement for Scikit-Learn’s GridSearchCV / RandomizedSearchCV -- but with cutting edge hyperparameter tuning techniques.
Stars: ✭ 241 (-57.27%)
Mutual labels:  automl, bayesian-optimization, hyperparameter-tuning
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (-8.51%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-60.82%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (-57.27%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Auto-Surprise
An AutoRecSys library for Surprise. Automate algorithm selection and hyperparameter tuning 🚀
Stars: ✭ 19 (-96.63%)
Mutual labels:  hyperparameter-tuning, automl, automated-machine-learning
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (-48.76%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Milano
Milano is a tool for automating hyper-parameters search for your models on a backend of your choice.
Stars: ✭ 140 (-75.18%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-96.1%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+595.04%)
Mutual labels:  automl, hyperparameter-optimization, automated-machine-learning
Automl alex
State-of-the art Automated Machine Learning python library for Tabular Data
Stars: ✭ 132 (-76.6%)
Mutual labels:  automl, hyperparameter-optimization, hyperparameter-tuning
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (-83.51%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, automl
Auto ml
[UNMAINTAINED] Automated machine learning for analytics & production
Stars: ✭ 1,559 (+176.42%)
Mutual labels:  automl, hyperparameter-optimization, automated-machine-learning
AutoPrognosis
Codebase for "AutoPrognosis: Automated Clinical Prognostic Modeling via Bayesian Optimization", ICML 2018.
Stars: ✭ 47 (-91.67%)
Mutual labels:  bayesian-optimization, automl, automated-machine-learning

SMAC v3 Project

Copyright (C) 2016-2018 AutoML Group

Attention: This package is a reimplementation of the original SMAC tool (see reference below). The reimplementation differs slightly from the original SMAC. For comparisons against the original SMAC, we refer to a stable release of SMAC (v2) in Java, which can be found here.

The documentation can be found here.


OVERVIEW

SMAC is a tool for algorithm configuration: it optimizes the parameters of arbitrary algorithms across a set of problem instances, which includes hyperparameter optimization of machine-learning algorithms. Its core combines Bayesian optimization with an aggressive racing mechanism to efficiently decide which of two configurations performs better.
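The racing idea can be sketched in a few lines of plain Python. This is an illustrative toy, not SMAC's actual implementation (which uses statistical tests and adaptive capping); the function names are hypothetical:

```python
def race(incumbent_cost, challenger_cost, instances, budget=8):
    """Toy racing: evaluate both configurations instance by instance and
    reject the challenger as soon as its accumulated cost falls behind.

    `incumbent_cost` / `challenger_cost` are hypothetical callables that
    return the cost of running one configuration on one instance.
    """
    inc_total = chal_total = 0.0
    for inst in instances[:budget]:
        inc_total += incumbent_cost(inst)
        chal_total += challenger_cost(inst)
        if chal_total > inc_total:  # aggressive early rejection
            return False
    return True

instances = list(range(10))
print(race(lambda i: 1.0, lambda i: 0.5, instances))  # True: challenger is cheaper
print(race(lambda i: 1.0, lambda i: 2.0, instances))  # False: rejected after one instance
```

Because evaluations stop as soon as a challenger is provably worse on the instances seen so far, most of the budget is spent on promising configurations.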

For a detailed description of its main idea, we refer to

Hutter, F. and Hoos, H. H. and Leyton-Brown, K.
Sequential Model-Based Optimization for General Algorithm Configuration
In: Proceedings of the conference on Learning and Intelligent OptimizatioN (LION 5)

SMAC v3 is written in Python 3 and continuously tested with Python 3.6. Its Random Forest is written in C++.

Installation

Requirements

Besides the listed requirements (see requirements.txt), the random forest used in SMAC3 requires SWIG (>= 3.0, <4.0) as a build dependency:

apt-get install swig

On Arch Linux (or any distribution with swig4 as default implementation):

pacman -Syu swig3
ln -s /usr/bin/swig-3 /usr/bin/swig

Installation via pip

SMAC3 is available on PyPI.

pip install smac

Manual Installation

git clone https://github.com/automl/SMAC3.git && cd SMAC3
cat requirements.txt | xargs -L 1 pip install
pip install .

Installation in Anaconda

If you use Anaconda as your Python environment, you need to install three additional packages before installing SMAC:

conda install gxx_linux-64 gcc_linux-64 swig

Optional dependencies

SMAC3 comes with a set of optional dependencies that can be installed using setuptools extras:

  • lhd: Latin hypercube design
  • gp: Gaussian process models

These can be installed from PyPI or manually:

# from PyPI
pip install smac[gp]

# manually
pip install .[gp,lhd]

For convenience, there is also an all meta-dependency that installs all optional dependencies:

pip install smac[all]

License

This program is free software: you can redistribute it and/or modify it under the terms of the 3-clause BSD license (please see the LICENSE file).

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

You should have received a copy of the 3-clause BSD license along with this program (see LICENSE file). If not, see https://opensource.org/licenses/BSD-3-Clause.

USAGE

Usage of SMAC v3 is largely the same as that of SMAC v2.08. It supports the same parameter configuration space (PCS) syntax (except for extended forbidden constraints) and the same interface to target algorithms.
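As an illustration, a minimal configuration space in the (old) PCS syntax might look as follows; all parameter names and values here are hypothetical:

```text
# real-valued parameter in [-5, 10] with default 0
x1 [-5, 10] [0]
# integer parameter, log-scaled
x2 [1, 100] [10]il
# categorical parameter
solver {sat4j, minisat, glucose} [minisat]
# conditional: x1 is only active if solver == minisat
x1 | solver in {minisat}
# forbidden combination
{solver=sat4j, x2=100}
```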

Examples

See examples/

  • examples/rosenbrock.py - example on how to optimize a Python function
  • examples/spear_qcp/run.sh - example on how to optimize the SAT solver Spear on a set of SAT formulas
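For reference, the objective minimized in the first example is the 2-D Rosenbrock function, a standard continuous test problem. The sketch below shows the function itself, not SMAC's API; the actual example's signature may differ:

```python
def rosenbrock(x1, x2):
    """2-D Rosenbrock function: global minimum 0 at (x1, x2) = (1, 1)."""
    return 100.0 * (x2 - x1 ** 2) ** 2 + (1.0 - x1) ** 2

print(rosenbrock(1.0, 1.0))  # 0.0 (the global minimum)
print(rosenbrock(0.0, 0.0))  # 1.0
```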

Contact

SMAC3 is developed by the AutoML Group of the University of Freiburg.

If you find a bug, please report it at https://github.com/automl/SMAC3/issues.

Our guidelines for contributing to this package can be found here.
