seba-1511 / randopt

License: Apache-2.0
Streamlined machine learning experiment management.

Programming Languages

HTML, Python, Makefile

Projects that are alternatives of or similar to randopt

allennlp-optuna
⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy
Stars: ✭ 33 (-69.44%)
Mutual labels:  hyperparameters, hyperparameter-optimization
optkeras
OptKeras: wrapper around Keras and Optuna for hyperparameter optimization
Stars: ✭ 29 (-73.15%)
Mutual labels:  hyperparameters, hyperparameter-optimization
go-bayesopt
A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.
Stars: ✭ 47 (-56.48%)
Mutual labels:  hyperparameter-optimization
lightning-hydra-template
PyTorch Lightning + Hydra. A very user-friendly template for rapid and reproducible ML experimentation with best practices. ⚡🔥⚡
Stars: ✭ 1,905 (+1663.89%)
Mutual labels:  experiments
outfit
👗 Tidy up your machine learning experiments
Stars: ✭ 17 (-84.26%)
Mutual labels:  experiments
forecastVeg
A Machine Learning Approach to Forecasting Remotely Sensed Vegetation Health in Python
Stars: ✭ 44 (-59.26%)
Mutual labels:  hyperparameters
cli
Polyaxon Core Client & CLI to streamline MLOps
Stars: ✭ 18 (-83.33%)
Mutual labels:  hyperparameter-optimization
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (-37.04%)
Mutual labels:  hyperparameter-optimization
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (-23.15%)
Mutual labels:  hyperparameter-optimization
alyx
Database for experimental neuroscience laboratories
Stars: ✭ 39 (-63.89%)
Mutual labels:  experiments
bbai
Set model hyperparameters using deterministic, exact algorithms.
Stars: ✭ 19 (-82.41%)
Mutual labels:  hyperparameter-optimization
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+123.15%)
Mutual labels:  hyperparameter-optimization
methods-guides
EGAP methods guides
Stars: ✭ 20 (-81.48%)
Mutual labels:  experiments
litmus-go
No description or website provided.
Stars: ✭ 49 (-54.63%)
Mutual labels:  experiments
pixel-experiments
Various experiments using the pixel library
Stars: ✭ 86 (-20.37%)
Mutual labels:  experiments
naturalselection
A general-purpose pythonic genetic algorithm.
Stars: ✭ 17 (-84.26%)
Mutual labels:  hyperparameter-optimization
shadho
Scalable, structured, dynamically-scheduled hyperparameter optimization.
Stars: ✭ 17 (-84.26%)
Mutual labels:  hyperparameter-optimization
optuna-dashboard
Real-time Web Dashboard for Optuna.
Stars: ✭ 240 (+122.22%)
Mutual labels:  hyperparameter-optimization
ml-pipeline
Using Kafka-Python to illustrate a ML production pipeline
Stars: ✭ 90 (-16.67%)
Mutual labels:  hyperparameter-optimization
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-79.63%)
Mutual labels:  hyperparameter-optimization



randopt is a Python package for machine learning experiment management, hyper-parameter optimization, and results visualization. Some of its features include:

  • result logging and management,
  • human-readable format,
  • support for parallel, distributed, and asynchronous experiments,
  • command-line and programmatic API,
  • shareable, flexible Web visualization,
  • automatic hyper-parameter search, and
  • pure Python - no dependencies!
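
Several of these points (parallel and asynchronous runs, human-readable storage) follow from one design choice: every result is stored as its own JSON file, so concurrent processes never contend and results can be inspected by hand. The sketch below illustrates that idea in plain Python; the file names and layout are illustrative only, not randopt's exact on-disk format.

```python
# Sketch of a file-per-result storage scheme: each run writes its own
# uniquely named JSON file, so parallel processes never need a lock and
# results stay human-readable. (Layout is illustrative only, not
# randopt's exact on-disk format.)
import json
import os
import random
import tempfile

def save_result(folder, params, result):
    os.makedirs(folder, exist_ok=True)
    name = '%030x.json' % random.getrandbits(120)  # unique per result
    with open(os.path.join(folder, name), 'w') as f:
        json.dump({'result': result, 'params': params}, f)

folder = os.path.join(tempfile.mkdtemp(), 'myexp')
for _ in range(5):
    alpha = random.gauss(0.0, 1.0)
    save_result(folder, {'alpha': alpha}, alpha ** 2)

# Any process can later scan the folder to find the best result so far.
results = [json.load(open(os.path.join(folder, f)))
           for f in os.listdir(folder)]
best = min(results, key=lambda r: r['result'])
print('stored %d results, best: %r' % (len(results), best))
```

Because each file is independent, "merging" results from several machines is just copying their experiment folders together.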

Installation

pip install randopt

Usage

import randopt as ro

def loss(x):
    return x**2

e = ro.Experiment('myexp', {
        'alpha': ro.Gaussian(mean=0.0, std=1.0, dtype='float'),
    })

# Sampling parameters
for i in range(100):
    e.sample('alpha')
    res = loss(e.alpha)
    print('Result: ', res)
    e.add_result(res)

# Manually setting parameters
e.alpha = 0.00001
res = loss(e.alpha)
e.add_result(res)

# Search over all experiments results, including ones from previous runs
opt = e.minimum()
print('Best result: ', opt.result, ' with params: ', opt.params)

Results Visualization

Once you have obtained some results, run roviz.py path/to/experiment/folder to visualize them in your web browser.

For more info on visualization and roviz.py, refer to the Visualizing Results tutorial.

Hyper-Parameter Optimization

To generate results and search for good hyper-parameters, you can either use ropt.py or write your own optimization script using the Evolutionary and GridSearch classes.

For more info on hyper-parameter optimization, refer to the Optimizing Hyperparams tutorial.
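
For intuition, the two strategies named above can be sketched in plain Python. These are algorithmic illustrations only; they do not use randopt's actual Evolutionary or GridSearch APIs, and the loss function and parameter values are made up for the example.

```python
# Plain-Python sketches of grid search and a simple evolutionary search.
# Illustrative only: not randopt's Evolutionary/GridSearch APIs.
import itertools
import random

def loss(alpha, beta):
    return alpha ** 2 + (beta - 0.5) ** 2

# Grid search: exhaustively evaluate every combination of candidates.
grid = {'alpha': [0.001, 0.01, 0.1], 'beta': [0.25, 0.5, 0.75]}
best = min(
    ((loss(a, b), {'alpha': a, 'beta': b})
     for a, b in itertools.product(grid['alpha'], grid['beta'])),
    key=lambda t: t[0],
)
print('grid best:', best)

# Evolutionary search (a minimal (1+1) scheme): perturb the best-known
# parameters and keep the perturbation whenever it improves the loss.
random.seed(0)
params = {'alpha': 1.0, 'beta': 0.0}
best_loss = loss(**params)
for _ in range(200):
    cand = {k: v + random.gauss(0.0, 0.1) for k, v in params.items()}
    cand_loss = loss(**cand)
    if cand_loss < best_loss:
        params, best_loss = cand, cand_loss
print('evolutionary best:', best_loss, params)
```

Grid search is exhaustive but grows exponentially with the number of parameters; the evolutionary scheme trades that guarantee for a fixed evaluation budget.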

Documentation

For more examples, tutorials, and documentation refer to the wiki.

Contributing

To contribute to Randopt, it is recommended to follow the contribution guidelines.

Acknowledgements

Randopt is maintained by Séb Arnold, with numerous contributions from the following people:

  • Noel Trivedi
  • Cyrus Jia
  • Daler Asrorov

License

Randopt is released under the Apache 2 License. For more information, refer to the LICENSE file.

I would love to hear how you use Randopt. Feel free to drop me a line!
