sherpa-ai / Sherpa

Licence: other
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.


Projects that are alternatives of or similar to Sherpa

Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+1947.06%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-88.24%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (-16.61%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+681.31%)
Mutual labels:  bayesian-optimization, hyperparameter-optimization, hyperparameter-tuning
syne-tune
Large scale and asynchronous Hyperparameter Optimization at your fingertip.
Stars: ✭ 105 (-63.67%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+78.55%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-92.39%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+95.16%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Automl alex
State-of-the art Automated Machine Learning python library for Tabular Data
Stars: ✭ 132 (-54.33%)
Mutual labels:  hyperparameter-optimization, machine-learning-library, hyperparameter-tuning
mlr3tuning
Hyperparameter optimization package of the mlr3 ecosystem
Stars: ✭ 44 (-84.78%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
polystores
A library for performing hyperparameter optimization
Stars: ✭ 48 (-83.39%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
naturalselection
A general-purpose pythonic genetic algorithm.
Stars: ✭ 17 (-94.12%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
open-box
Generalized and Efficient Blackbox Optimization System.
Stars: ✭ 64 (-77.85%)
Mutual labels:  bayesian-optimization, hyperparameter-tuning
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (-76.47%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-23.53%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Gpflowopt
Bayesian Optimization using GPflow
Stars: ✭ 229 (-20.76%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (-71.28%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
mltb
Machine Learning Tool Box
Stars: ✭ 25 (-91.35%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
bbopt
Black box hyperparameter optimization made easy.
Stars: ✭ 66 (-77.16%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (-50.17%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization

SHERPA: A Python Hyperparameter Optimization Library

.. figure:: https://docs.google.com/drawings/d/e/2PACX-1vRaTP5d5WqT4KY4V57niI4wFDkz0098zHTRzZ9n7SzzFtdN5akBd75HchBnhYI-GPv_AYH1zYa0O2_0/pub?w=522&h=150
   :figwidth: 100%
   :align: right
   :height: 150px
   :alt: SHERPA logo

.. image:: https://img.shields.io/badge/License-GPL%20v3-blue.svg
   :target: https://www.gnu.org/licenses/gpl-3.0

.. image:: https://travis-ci.org/sherpa-ai/sherpa.svg?branch=master
   :target: https://travis-ci.org/sherpa-ai/sherpa.svg?branch=master
   :alt: Build Status

.. image:: https://pepy.tech/badge/parameter-sherpa
   :target: https://pepy.tech/project/parameter-sherpa

SHERPA is a Python library for hyperparameter tuning of machine learning models. It provides:

  • hyperparameter optimization for machine learning researchers
  • compatibility with any Python machine learning library, such as Keras, TensorFlow, PyTorch, or scikit-learn
  • a choice of hyperparameter optimization algorithms, such as Bayesian optimization via GPyOpt (`example notebook <https://github.com/sherpa-ai/sherpa/tree/master/examples/keras_mnist_mlp.ipynb>`_), Asynchronous Successive Halving (aka Hyperband) (`example notebook <https://github.com/sherpa-ai/sherpa/tree/master/examples/keras_mnist_mlp_successive_halving.ipynb>`_), and Population Based Training (`example notebook <https://github.com/sherpa-ai/sherpa/tree/master/examples/keras_mnist_mlp_population_based_training.ipynb>`_)
  • parallel computation that can be tailored to the user's needs
  • a live dashboard for the exploratory analysis of results

Clone from GitHub to get the latest version, or install via ``pip install parameter-sherpa``. The documentation at http://parameter-sherpa.readthedocs.io/ provides tutorials on the different optimization algorithms and installation instructions for parallel hyperparameter optimization. Take a look at the demo video by clicking on the image below, or read on to find out more.

We would love to hear what you think of Sherpa! Tell us how we can improve via our Feedback-Form_.

.. _Feedback-Form: https://forms.gle/b3HoyJZHjQnYtv677

.. image:: http://img.youtube.com/vi/-exnF3uv0Ws/0.jpg
   :target: https://www.youtube.com/watch?feature=player_embedded&v=-exnF3uv0Ws

If you use SHERPA in your research please cite:

::

@article{hertel2020sherpa,
   title={Sherpa: Robust Hyperparameter Optimization for Machine Learning},
   author={Lars Hertel and Julian Collado and Peter Sadowski and Jordan Ott and Pierre Baldi},
   journal={SoftwareX},
   year={2020},
   note={In press. Also arXiv:2005.04048. Software available at: https://github.com/sherpa-ai/sherpa}
}

From Keras to Sherpa in 30 seconds

This example shows how to adapt a minimal Keras script so that it can be used with SHERPA. As a starting point, we use the "getting started in 30 seconds" tutorial from the Keras webpage.

We start out with this piece of Keras code:

::

from keras.models import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])

We want to tune the number of hidden units. To do that, we define one parameter of type Discrete. We also use the BayesianOptimization algorithm with a maximum of 50 trials.

::

import sherpa
parameters = [sherpa.Discrete('num_units', [50, 200])]
alg = sherpa.algorithms.BayesianOptimization(max_num_trials=50)
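
Discrete is only one of SHERPA's parameter types. Purely as a rough sketch (the exact class signatures should be checked against the documentation; the names and ranges below are illustrative and are not used in the rest of this example), continuous and categorical hyperparameters can be declared in the same list:

::

# Illustrative only; assumes sherpa.Continuous and sherpa.Choice as described in the docs.
sketch_parameters = [sherpa.Discrete('num_units', [50, 200]),
                     sherpa.Continuous('learning_rate', [1e-4, 1e-1], scale='log'),
                     sherpa.Choice('activation', ['relu', 'tanh'])]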

We use these objects to create a SHERPA Study:

::

study = sherpa.Study(parameters=parameters,
                     algorithm=alg,
                     lower_is_better=True)
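
Here lower_is_better=True tells SHERPA that smaller objective values are better, which matches the validation loss reported below. If you were instead reporting a metric such as accuracy, you would create the Study with lower_is_better=False, for example:

::

# Hypothetical variant for an objective where larger values are better (e.g. accuracy).
study = sherpa.Study(parameters=parameters,
                     algorithm=alg,
                     lower_is_better=False)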

We obtain trials by iterating over the study. Each trial has a parameters attribute that contains the num_units value. We can use that value to create our model.

::

for trial in study:
    model = Sequential()
    model.add(Dense(units=trial.parameters['num_units'],
                    activation='relu', input_dim=100))
    model.add(Dense(units=10, activation='softmax'))
    model.compile(loss='categorical_crossentropy',
                  optimizer='sgd',
                  metrics=['accuracy'])

    model.fit(x_train, y_train, epochs=5, batch_size=32,
              callbacks=[study.keras_callback(trial, objective_name='val_loss')])
    study.finalize(trial)

During training, objective values are added to the SHERPA study via the callback. At the end of training, study.finalize completes the trial, which means that no further observations will be added to it.
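
The Keras callback is only a convenience. If you are not using Keras, the same bookkeeping can be done by hand with the study's add_observation method. The sketch below assumes the add_observation(trial, iteration, objective) signature described in the SHERPA documentation and a hypothetical build_model helper; verify both against your installed version:

::

# Sketch: reporting objective values manually instead of via the Keras callback.
# build_model is a hypothetical helper that builds a model for the given number of units.
for trial in study:
    model = build_model(trial.parameters['num_units'])
    for epoch in range(5):
        model.fit(x_train, y_train, batch_size=32, epochs=1)
        loss, accuracy = model.evaluate(x_test, y_test)
        study.add_observation(trial=trial, iteration=epoch, objective=loss)
    study.finalize(trial)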

When the Study is created, SHERPA displays the dashboard address. If you put that address into your browser, you will see the dashboard shown below. As a next step, take a look at the example of optimizing a Random Forest in sherpa/examples/randomforest.py; a rough sketch of such a loop follows the screenshot.

.. figure:: https://drive.google.com/uc?export=view&id=1G85sfwLicsQKd3-1xN7DZowQ0gHAvzGx
   :alt: SHERPA Dashboard.
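
The random forest example itself is not reproduced here. The sketch below only illustrates the general pattern of wrapping a scikit-learn estimator in a SHERPA loop; the parameter names, ranges, and the use of sherpa.algorithms.RandomSearch are assumptions and do not necessarily match examples/randomforest.py:

::

# Illustrative sketch of tuning a scikit-learn RandomForestClassifier with SHERPA.
# Not the contents of examples/randomforest.py; names and ranges are assumptions.
import sherpa
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

parameters = [sherpa.Discrete('n_estimators', [10, 200]),
              sherpa.Discrete('max_depth', [2, 16])]
algorithm = sherpa.algorithms.RandomSearch(max_num_trials=30)
study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=False)  # cross-validated accuracy: higher is better

for trial in study:
    clf = RandomForestClassifier(n_estimators=trial.parameters['n_estimators'],
                                 max_depth=trial.parameters['max_depth'],
                                 random_state=0)
    score = cross_val_score(clf, X, y, cv=3).mean()
    study.add_observation(trial=trial, iteration=1, objective=score)
    study.finalize(trial)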

Installation from PyPi

::

pip install parameter-sherpa

Installation from GitHub

Clone from GitHub:

::

git clone https://github.com/LarsHH/sherpa.git
export PYTHONPATH=$PYTHONPATH:`pwd`/sherpa

Install dependencies:

::

pip install pandas
pip install numpy
pip install scipy
pip install scikit-learn
pip install flask
pip install enum34  # if on < Python 3.4

You can run an example to verify SHERPA is working:

::

cd sherpa/examples/
python simple.py

Note that running hyperparameter optimizations in parallel with SHERPA requires an installation of MongoDB. Further instructions can be found in the Parallel Installation section of the documentation. A rough outline of how such a parallel run is launched is sketched below.
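
For orientation, parallel runs are driven by sherpa.optimize together with a scheduler rather than by the Study loop shown above. The outline below is a sketch only; the argument names (filename, output_dir, max_concurrent) and the scheduler class are assumptions that should be checked against the parallel-usage documentation:

::

# Rough outline of a parallel run (requires MongoDB); argument names are assumptions.
import sherpa
import sherpa.schedulers

parameters = [sherpa.Discrete('num_units', [50, 200])]
algorithm = sherpa.algorithms.BayesianOptimization(max_num_trials=50)

sherpa.optimize(parameters=parameters,
                algorithm=algorithm,
                lower_is_better=True,
                filename='trial_script.py',   # user-provided script that trains one trial
                output_dir='./sherpa_output',
                scheduler=sherpa.schedulers.LocalScheduler(),
                max_concurrent=4)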

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].