himkt / optuna-allennlp

License: other
🚀 A demonstration of hyperparameter optimization using Optuna for models implemented with AllenNLP.

Programming Languages

Jupyter Notebook (11667 projects)
Python (139335 projects; #7 most used programming language)
Jsonnet (166 projects)

Projects that are alternatives of or similar to optuna-allennlp

allennlp-optuna
⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy
Stars: ✭ 33 (+94.12%)
Mutual labels:  hyperparameter-optimization, allennlp
autotune
Autonomous Performance Tuning for Kubernetes!
Stars: ✭ 84 (+394.12%)
Mutual labels:  hyperparameter-optimization
maggy
Distribution transparent Machine Learning experiments on Apache Spark
Stars: ✭ 83 (+388.24%)
Mutual labels:  hyperparameter-optimization
naacl2019-select-pretraining-data-for-ner
BiLSTM-CRF model for NER
Stars: ✭ 15 (-11.76%)
Mutual labels:  allennlp
allennlp imdb
AllenNLP Startup Guide
Stars: ✭ 13 (-23.53%)
Mutual labels:  allennlp
optuna-examples
Examples for https://github.com/optuna/optuna
Stars: ✭ 238 (+1300%)
Mutual labels:  hyperparameter-optimization
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (+29.41%)
Mutual labels:  hyperparameter-optimization
miraiml
MiraiML: asynchronous, autonomous and continuous Machine Learning in Python
Stars: ✭ 23 (+35.29%)
Mutual labels:  hyperparameter-optimization
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (+17.65%)
Mutual labels:  hyperparameter-optimization
keras-hypetune
A friendly Python package for Keras hyperparameter tuning, based only on NumPy and hyperopt.
Stars: ✭ 47 (+176.47%)
Mutual labels:  hyperparameter-optimization
randopt
Streamlined machine learning experiment management.
Stars: ✭ 108 (+535.29%)
Mutual labels:  hyperparameter-optimization
textlearnR
A simple collection of well-working NLP models (Keras, H2O, StarSpace) tuned and benchmarked on a variety of datasets.
Stars: ✭ 16 (-5.88%)
Mutual labels:  hyperparameter-optimization
hone
A shell-friendly hyperparameter search tool inspired by Optuna
Stars: ✭ 17 (+0%)
Mutual labels:  hyperparameter-optimization
ml-pipeline
Using Kafka-Python to illustrate an ML production pipeline
Stars: ✭ 90 (+429.41%)
Mutual labels:  hyperparameter-optimization
osprey
🦅 Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (+317.65%)
Mutual labels:  hyperparameter-optimization
wandb-allennlp
Utilities and boilerplate code to use wandb with allennlp
Stars: ✭ 20 (+17.65%)
Mutual labels:  allennlp
mltb
Machine Learning Tool Box
Stars: ✭ 25 (+47.06%)
Mutual labels:  hyperparameter-optimization
cmaes
Python library for CMA Evolution Strategy.
Stars: ✭ 174 (+923.53%)
Mutual labels:  hyperparameter-optimization
ultraopt
Distributed Asynchronous Hyperparameter Optimization better than HyperOpt.
Stars: ✭ 93 (+447.06%)
Mutual labels:  hyperparameter-optimization
athnlp-labs
Athens NLP Summer School Labs
Stars: ✭ 41 (+141.18%)
Mutual labels:  allennlp

(Figure: experimental result)

Optuna using AllenNLP

A demonstration of using Optuna with the AllenNLP integration.

Quick Start

Google Colab

Open in Colab

On your computer

# create virtual environment
python3 -m venv venv
. venv/bin/activate

# install libraries
pip install -r requirements.txt

# train a baseline model using the AllenNLP CLI
allennlp train -s result/allennlp config/imdb_baseline.jsonnet

# run hyperparameter optimization
python optuna_train.py

# define-and-run style example
python optuna_train_custom_trainer.py --device 0 --target_metric accuracy --base_serialization_dir result
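
For reference, here is a minimal sketch of what an objective like the one in optuna_train.py can look like when built on Optuna's AllenNLP integration (AllenNLPExecutor). It assumes the Jsonnet config reads hyperparameters through std.extVar; the parameter names, ranges, metric, and serialization path below are illustrative, not the repository's exact values.

import optuna
from optuna.integration import AllenNLPExecutor


def objective(trial: optuna.Trial) -> float:
    # Suggested values are forwarded to the Jsonnet config as ext vars,
    # so the names must match the std.extVar keys in the config.
    trial.suggest_float("lr", 1e-4, 1e-1, log=True)  # illustrative range
    trial.suggest_int("embedding_dim", 32, 128)      # illustrative range

    executor = AllenNLPExecutor(
        trial,
        config_file="config/imdb_optuna.jsonnet",
        serialization_dir=f"result/optuna/{trial.number}",  # assumed layout
        metrics="best_validation_accuracy",
    )
    return executor.run()  # trains the model and returns the target metric


if __name__ == "__main__":
    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)
    print(study.best_trial)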

[New!!] Use allennlp-optuna

You can use allennlp-optuna, an AllenNLP plugin that adds subcommands for hyperparameter optimization with Optuna.

# Installation
pip install allennlp-optuna

# Register allennlp-optuna with allennlp via the .allennlp_plugins file.
# This step is not required if .allennlp_plugins already exists in your working directory.
echo 'allennlp_optuna' >> .allennlp_plugins

# optimization
allennlp tune config/imdb_optuna.jsonnet config/hparams.json --serialization-dir result
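
Here, config/hparams.json defines the search space. Below is a sketch of what it can look like, following the search-space format described in the allennlp-optuna README; the parameter names, types, and ranges are illustrative and must match the std.extVar keys used in config/imdb_optuna.jsonnet.

[
  {
    "type": "int",
    "attributes": {"name": "embedding_dim", "low": 64, "high": 128}
  },
  {
    "type": "float",
    "attributes": {"name": "dropout", "low": 0.0, "high": 0.5}
  },
  {
    "type": "float",
    "attributes": {"name": "lr", "low": 5e-3, "high": 5e-1, "log": true}
  }
]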

Attention!

The demonstration uses a GPU. To run the scripts in this repository on a CPU instead, set cuda_device = -1 in the AllenNLP config and the Optuna config.
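
For example, the relevant trainer fragment of an AllenNLP Jsonnet config would look roughly like this (a sketch; the surrounding keys depend on the actual files in config/):

{
  trainer: {
    // -1 trains on CPU; 0 selects the first GPU
    cuda_device: -1,
  },
}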

Blog Articles
