
msmbuilder / osprey

License: Apache-2.0
🦅 Hyperparameter optimization for machine learning pipelines 🦅

Programming Languages

Python: 139,335 projects (#7 most used programming language)
Shell: 77,523 projects
TeX: 3,793 projects

Projects that are alternatives of or similar to osprey

Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+3080.28%)
Mutual labels:  optimization, scikit-learn, hyperparameter-optimization
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+156.34%)
Mutual labels:  optimization, scikit-learn, hyperparameter-optimization
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+812.68%)
Mutual labels:  optimization, scikit-learn, hyperparameter-optimization
Chocolate
A fully decentralized hyperparameter optimization framework
Stars: ✭ 112 (+57.75%)
Mutual labels:  optimization, hyperparameter-optimization
Gradient Free Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+901.41%)
Mutual labels:  optimization, hyperparameter-optimization
Rl Baselines Zoo
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Stars: ✭ 839 (+1081.69%)
Mutual labels:  optimization, hyperparameter-optimization
Far Ho
Gradient based hyperparameter optimization & meta-learning package for TensorFlow
Stars: ✭ 161 (+126.76%)
Mutual labels:  optimization, hyperparameter-optimization
Rl Baselines3 Zoo
A collection of pre-trained RL agents using Stable Baselines3, training and hyperparameter optimization included.
Stars: ✭ 161 (+126.76%)
Mutual labels:  optimization, hyperparameter-optimization
Mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
Stars: ✭ 173 (+143.66%)
Mutual labels:  optimization, hyperparameter-optimization
Bayesian Optimization
Python code for Bayesian optimization using Gaussian processes
Stars: ✭ 245 (+245.07%)
Mutual labels:  optimization, hyperparameter-optimization
Cornell Moe
A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+178.87%)
Mutual labels:  optimization, hyperparameter-optimization
Ray
An open source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+26022.54%)
Mutual labels:  optimization, hyperparameter-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+626.76%)
Mutual labels:  optimization, hyperparameter-optimization
Hyperlearn
50% faster, 50% less RAM Machine Learning. Numba rewritten Sklearn. SVD, NNMF, PCA, LinearReg, RidgeReg, Randomized, Truncated SVD/PCA, CSR Matrices all 50+% faster
Stars: ✭ 1,204 (+1595.77%)
Mutual labels:  optimization, scikit-learn
Simple
Experimental Global Optimization Algorithm
Stars: ✭ 450 (+533.8%)
Mutual labels:  optimization, hyperparameter-optimization
mlrHyperopt
Easy Hyper Parameter Optimization with mlr and mlrMBO.
Stars: ✭ 30 (-57.75%)
Mutual labels:  optimization, hyperparameter-optimization
ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (+30.99%)
Mutual labels:  optimization, hyperparameter-optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (+102.82%)
Mutual labels:  optimization, hyperparameter-optimization
cli
Polyaxon Core Client & CLI to streamline MLOps
Stars: ✭ 18 (-74.65%)
Mutual labels:  scikit-learn, hyperparameter-optimization
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (-4.23%)
Mutual labels:  scikit-learn, hyperparameter-optimization

Osprey



Osprey is an easy-to-use tool for hyperparameter optimization of machine learning algorithms in Python, using scikit-learn (or any scikit-learn-compatible API).

Each Osprey experiment combines a dataset, an estimator, a search space (and search engine), cross-validation, and asynchronous serialization of results to a shared database, enabling distributed, parallel optimization of model hyperparameters.

Documentation

For full documentation, please visit the Osprey homepage.

Installation

If you have an Anaconda Python distribution, installation is as easy as:

$ conda install -c omnia osprey

You can also install Osprey with pip:

$ pip install osprey

Alternatively, you can install directly from this GitHub repo:

$ git clone https://github.com/msmbuilder/osprey.git
$ cd osprey && git checkout 1.1.0
$ python setup.py install
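
If you want a scaffold for the configuration file described in the next section, the Osprey CLI also provides a skeleton sub-command that writes a template config to edit. This is a hedged sketch from memory of the Osprey docs, not part of this README, so confirm the details with osprey skeleton --help:

$ # Assumed behavior: writes a template config file (config.yaml) to edit
$ osprey skeleton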

Example using MSMBuilder

Below is an example of an Osprey config file that cross-validates Markov state models while varying the number of clusters and the dihedral-angle types used to featurize the model. Note that search-space keys follow scikit-learn's step__parameter naming convention, so cluster__n_clusters refers to the n_clusters parameter of the Pipeline step named cluster:

estimator:
  eval_scope: msmbuilder
  eval: |
    Pipeline([
        ('featurizer', DihedralFeaturizer(types=['phi', 'psi'])),
        ('cluster', MiniBatchKMeans()),
        ('msm', MarkovStateModel(n_timescales=5, verbose=False)),
    ])

search_space:
  cluster__n_clusters:
    min: 10
    max: 100
    type: int
  featurizer__types:
    choices:
      - ['phi', 'psi']
      - ['phi', 'psi', 'chi1']
    type: enum

cv: 5

dataset_loader:
  name: mdtraj
  params:
    trajectories: ~/local/msmbuilder/Tutorial/XTC/*/*.xtc
    topology: ~/local/msmbuilder/Tutorial/native.pdb
    stride: 1

trials:
    uri: sqlite:///osprey-trials.db

Then run osprey worker. Because all trial results are written to the shared database specified under trials, you can also run many parallel instances of osprey worker simultaneously on a cluster; a sketch follows the sample output below.

$ osprey worker config.yaml

...

----------------------------------------------------------------------
Beginning iteration                                              1 / 1
----------------------------------------------------------------------
History contains: 0 trials
Choosing next hyperparameters with random...
  {'cluster__n_clusters': 20, 'featurizer__types': ['phi', 'psi']}

Fitting 5 folds for each of 1 candidates, totalling 5 fits
[Parallel(n_jobs=1)]: Done   1 jobs       | elapsed:    0.3s
[Parallel(n_jobs=1)]: Done   5 out of   5 | elapsed:    1.8s finished
---------------------------------
Success! Model score = 4.080646
(best score so far   = 4.080646)
---------------------------------

1/1 models fit successfully.
time:         October 27, 2014 10:44 PM
elapsed:      4 seconds.
osprey worker exiting.
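
Every worker coordinates through the shared trials database named in the config, so scaling out is just a matter of starting more worker processes. Below is a minimal sketch, with the assumption (based on the Osprey docs) that the -n flag bounds the number of optimization iterations per worker; on a real cluster you would submit each worker as a separate job:

# Launch four workers against the same sqlite:///osprey-trials.db,
# each running 25 iterations and logging to its own file.
for i in 1 2 3 4; do
    osprey worker config.yaml -n 25 > worker-$i.log 2>&1 &
done
wait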

You can dump the trials database to JSON or CSV with osprey dump.
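
For example (the -o output-format flag is my reading of the Osprey docs; verify with osprey dump --help):

$ osprey dump config.yaml -o json > trials.json
$ osprey dump config.yaml -o csv > trials.csv

Because the trials store is an ordinary SQLite file, you can also inspect it directly; the table names are Osprey internals, so discover them first:

$ sqlite3 osprey-trials.db '.tables'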

Dependencies

  • python>=2.7.11
  • six>=1.10.0
  • pyyaml>=3.11
  • numpy>=1.10.4
  • scipy>=0.17.0
  • scikit-learn>=0.17.0
  • sqlalchemy>=1.0.10
  • bokeh>=0.12.0
  • matplotlib>=1.5.0
  • pandas>=0.18.0
  • GPy (optional, required for gp strategy)
  • hyperopt (optional, required for hyperopt_tpe strategy)
  • nose (optional, for testing)

Contributing

If you encounter any issues with this package, please submit a ticket to the GitHub issue tracker. We also welcome feature requests and highly encourage pull requests for bug fixes and improvements.

For more detailed information, please refer to our documentation.

Citing

If you use Osprey in your research, please cite:

@misc{osprey,
  author       = {Robert T. McGibbon and
                  Carlos X. Hernández and
                  Matthew P. Harrigan and
                  Steven Kearnes and
                  Mohammad M. Sultan and
                  Stanislaw Jastrzebski and
                  Brooke E. Husic and
                  Vijay S. Pande},
  title        = {Osprey: Hyperparameter Optimization for Machine Learning},
  month        = sep,
  year         = 2016,
  doi          = {10.21105/joss.00034},
  url          = {http://dx.doi.org/10.21105/joss.00034}
}