
KerasTuner


KerasTuner is an easy-to-use, scalable hyperparameter optimization framework that solves the pain points of hyperparameter search. Easily configure your search space with a define-by-run syntax, then leverage one of the available search algorithms to find the best hyperparameter values for your models. KerasTuner comes with Bayesian Optimization, Hyperband, and Random Search algorithms built-in, and is also designed to be easy for researchers to extend in order to experiment with new search algorithms.

Official Website: https://keras.io/keras_tuner/


Installation

KerasTuner requires Python 3.6+ and TensorFlow 2.0+.

Install the latest release:

pip install keras-tuner --upgrade

You can also check out other versions in our GitHub repository.
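
For example, to try the latest development code instead of a release, you can install directly from the repository (a sketch; you may prefer to pin a specific tag or commit):

pip install git+https://github.com/keras-team/keras-tuner.git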


Quick introduction

Import KerasTuner and TensorFlow:

import keras_tuner as kt
from tensorflow import keras

Write a function that creates and returns a Keras model. Use the hp argument to define the hyperparameters during model creation.

def build_model(hp):
  model = keras.Sequential()
  # Tune the width of the hidden layer: try 8, 16, or 32 units.
  model.add(keras.layers.Dense(
      hp.Choice('units', [8, 16, 32]),
      activation='relu'))
  model.add(keras.layers.Dense(1, activation='relu'))
  model.compile(loss='mse')
  return model
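
hp.Choice is only one of the available hyperparameter types; KerasTuner also provides hp.Int, hp.Float, and hp.Boolean. Here is a minimal sketch of a build function using them (the function and hyperparameter names are just examples):

def build_tunable_model(hp):
  model = keras.Sequential()
  # Integer hyperparameter: hidden layer width from 32 to 512 in steps of 32.
  model.add(keras.layers.Dense(
      hp.Int('units', min_value=32, max_value=512, step=32),
      activation='relu'))
  # Boolean hyperparameter: whether to insert a dropout layer.
  if hp.Boolean('use_dropout'):
    model.add(keras.layers.Dropout(0.25))
  model.add(keras.layers.Dense(1))
  # Float hyperparameter: learning rate sampled on a log scale.
  model.compile(
      optimizer=keras.optimizers.Adam(
          hp.Float('learning_rate', min_value=1e-4, max_value=1e-2, sampling='log')),
      loss='mse')
  return model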

Initialize a tuner (here, RandomSearch). Use objective to specify the metric used to select the best models, and max_trials to set the number of different models to try.

tuner = kt.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=5)
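
The other built-in algorithms mentioned above can be used the same way, for example (a sketch with the BayesianOptimization and Hyperband tuners; the variable names are arbitrary):

# Bayesian optimization over the same search space.
bo_tuner = kt.BayesianOptimization(
    build_model,
    objective='val_loss',
    max_trials=5)

# Hyperband allocates training epochs adaptively; max_epochs caps the budget per model.
hb_tuner = kt.Hyperband(
    build_model,
    objective='val_loss',
    max_epochs=10)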

Start the search and get the best model:

# x_train/y_train and x_val/y_val are your own training and validation data.
tuner.search(x_train, y_train, epochs=5, validation_data=(x_val, y_val))
best_model = tuner.get_best_models()[0]
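
Besides the best trained model, you can retrieve the best hyperparameter values and rebuild a fresh model from them, for example to retrain on more data (a minimal sketch):

# Hyperparameter values of the best trial.
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hps.get('units'))

# Build a new, untrained model with those values and train it from scratch.
model = build_model(best_hps)
model.fit(x_train, y_train, epochs=5)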

To learn more about KerasTuner, check out the getting started guide on the official website.


Contributing Guide

Please refer to the CONTRIBUTING.md for the contributing guide.


Community

Please use the #keras-tuner channel in the Keras Slack workspace for communication.

Use this link to request an invitation to the channel.


Citing KerasTuner

If KerasTuner helps your research, we appreciate your citations. Here is the BibTeX entry:

@misc{omalley2019kerastuner,
	title        = {KerasTuner},
	author       = {O'Malley, Tom and Bursztein, Elie and Long, James and Chollet, Fran\c{c}ois and Jin, Haifeng and Invernizzi, Luca and others},
	year         = 2019,
	howpublished = {\url{https://github.com/keras-team/keras-tuner}}
}