
sile / hone

License: MIT
A shell-friendly hyperparameter search tool inspired by Optuna

Programming Languages

rust

Projects that are alternatives of or similar to hone

optkeras
OptKeras: wrapper around Keras and Optuna for hyperparameter optimization
Stars: ✭ 29 (+70.59%)
Mutual labels:  hyperparameter-optimization
naturalselection
A general-purpose pythonic genetic algorithm.
Stars: ✭ 17 (+0%)
Mutual labels:  hyperparameter-optimization
textlearnR
A simple collection of well-working NLP models (Keras, H2O, StarSpace) tuned and benchmarked on a variety of datasets.
Stars: ✭ 16 (-5.88%)
Mutual labels:  hyperparameter-optimization
mlr3tuning
Hyperparameter optimization package of the mlr3 ecosystem
Stars: ✭ 44 (+158.82%)
Mutual labels:  hyperparameter-optimization
cli
Polyaxon Core Client & CLI to streamline MLOps
Stars: ✭ 18 (+5.88%)
Mutual labels:  hyperparameter-optimization
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (+388.24%)
Mutual labels:  hyperparameter-optimization
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+1200%)
Mutual labels:  hyperparameter-optimization
mltb
Machine Learning Tool Box
Stars: ✭ 25 (+47.06%)
Mutual labels:  hyperparameter-optimization
bbai
Set model hyperparameters using deterministic, exact algorithms.
Stars: ✭ 19 (+11.76%)
Mutual labels:  hyperparameter-optimization
ProxGradPytorch
PyTorch implementation of Proximal Gradient Algorithms a la Parikh and Boyd (2014). Useful for Auto-Sizing (Murray and Chiang 2015, Murray et al. 2019).
Stars: ✭ 28 (+64.71%)
Mutual labels:  hyperparameter-optimization
shadho
Scalable, structured, dynamically-scheduled hyperparameter optimization.
Stars: ✭ 17 (+0%)
Mutual labels:  hyperparameter-optimization
optuna-dashboard
Real-time Web Dashboard for Optuna.
Stars: ✭ 240 (+1311.76%)
Mutual labels:  hyperparameter-optimization
ml-pipeline
Using Kafka-Python to illustrate an ML production pipeline
Stars: ✭ 90 (+429.41%)
Mutual labels:  hyperparameter-optimization
go-bayesopt
A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.
Stars: ✭ 47 (+176.47%)
Mutual labels:  hyperparameter-optimization
keras-hypetune
A friendly python package for Keras Hyperparameters Tuning based only on NumPy and hyperopt.
Stars: ✭ 47 (+176.47%)
Mutual labels:  hyperparameter-optimization
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (+300%)
Mutual labels:  hyperparameter-optimization
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (+29.41%)
Mutual labels:  hyperparameter-optimization
optuna-examples
Examples for https://github.com/optuna/optuna
Stars: ✭ 238 (+1300%)
Mutual labels:  hyperparameter-optimization
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (+17.65%)
Mutual labels:  hyperparameter-optimization
randopt
Streamlined machine learning experiment management.
Stars: ✭ 108 (+535.29%)
Mutual labels:  hyperparameter-optimization

hone

#!/bin/bash
#
# Usage:
# $ hone init
# $ hone run --study mnist --repeats 10 examples/pytorch-mnist.sh
# $ hone trials mnist | hone best
#
set -eux

# PyTorch MNIST example used as the training script for each trial.
SCRIPT_URL=https://raw.githubusercontent.com/pytorch/examples/master/mnist/main.py

# Draw this trial's hyperparameter values from the given choices.
LR=$(hone get lr choice 0.001 0.01 0.1 1.0)
GAMMA=$(hone get gamma choice 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9)

# Train for three epochs with the chosen values and keep the log.
curl -L "$SCRIPT_URL" | python -u - --lr="$LR" --gamma="$GAMMA" --epochs 3 | tee /tmp/mnist.log

# Extract the final test loss from the log (GNU grep, -P for the lookbehind) and report it via hone report.
grep -oP '(?<=Test set: Average loss: )[0-9.]*' /tmp/mnist.log | tail -1 | xargs hone report

Tips

How to set a timeout for a study or a trial

Please use the timeout command.
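For example, a minimal sketch: timeout here is the GNU coreutils utility, the hone run command line is the one from the script header above, and the durations and the locally saved main.py are illustrative assumptions.

# Limit the whole study: wrap the hone run invocation so the search stops after one hour.
timeout 1h hone run --study mnist --repeats 10 examples/pytorch-mnist.sh

# Limit a single trial: inside the trial script, wrap the long-running training step
# (assuming main.py has been saved locally) so each trial gets at most ten minutes.
timeout 10m python -u main.py --lr="$LR" --gamma="$GAMMA" --epochs 3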