LiYangHart / Hyperparameter Optimization Of Machine Learning Algorithms

License: MIT
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)

Projects that are alternatives to or similar to Hyperparameter Optimization Of Machine Learning Algorithms

Scikit Optimize
Sequential model-based optimization with a `scipy.optimize` interface
Stars: ✭ 2,258 (+337.6%)
Mutual labels:  bayesian-optimization, optimization, hyperparameter-optimization, hyperparameter-tuning
ultraopt
Distributed asynchronous hyperparameter optimization, designed to be stronger than HyperOpt.
Stars: ✭ 93 (-81.98%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization
NeuroEvolution-Flappy-Bird
A comparison between humans, neuroevolution, and multilayer perceptrons playing Flappy Bird, implemented in Python
Stars: ✭ 17 (-96.71%)
Mutual labels:  genetic-algorithm, machine-learning-algorithms, artificial-neural-networks
Sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (-43.99%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (-53.29%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
naturalselection
A general-purpose pythonic genetic algorithm.
Stars: ✭ 17 (-96.71%)
Mutual labels:  genetic-algorithm, hyperparameter-optimization, hyperparameter-tuning
syne-tune
Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-79.65%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Fantasy Basketball
Scraping statistics, predicting NBA player performance with neural networks and boosting algorithms, and optimizing DraftKings lineups with a genetic algorithm. Capstone project for Udacity's Machine Learning Engineer Nanodegree.
Stars: ✭ 146 (-71.71%)
Mutual labels:  jupyter-notebook, genetic-algorithm, optimization
mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-93.41%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Deeplearning.ai Notes
My notes from the Deep Learning Specialization taught by Andrew Ng. Diagrams and code snippets from the course are used where needed, in accordance with the Honor Code.
Stars: ✭ 262 (-49.22%)
Mutual labels:  jupyter-notebook, hyperparameter-tuning, optimization
Python Ml Course
Introduction to Machine Learning with Python (course material in Spanish)
Stars: ✭ 442 (-14.34%)
Mutual labels:  jupyter-notebook, svm, artificial-neural-networks
Bayesian Optimization
Python code for Bayesian optimization using Gaussian processes
Stars: ✭ 245 (-52.52%)
Mutual labels:  jupyter-notebook, hyperparameter-optimization, optimization
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-63.57%)
Mutual labels:  jupyter-notebook, hyperparameter-optimization, hyperparameter-tuning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-95.74%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
Far Ho
Gradient-based hyperparameter optimization & meta-learning package for TensorFlow
Stars: ✭ 161 (-68.8%)
Mutual labels:  jupyter-notebook, hyperparameter-optimization, optimization
zoofs
zoofs is a Python library for feature selection using a variety of nature-inspired wrapper algorithms, ranging from swarm-intelligence to physics-based to evolutionary methods. It is an easy-to-use, flexible, and powerful tool for reducing feature-set size.
Stars: ✭ 142 (-72.48%)
Mutual labels:  optimization, genetic-algorithm, machine-learning-algorithms
Pytorch classification
A complete PyTorch image-classification codebase: training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
Stars: ✭ 395 (-23.45%)
Mutual labels:  jupyter-notebook, svm, random-forest
Machine Learning Without Any Libraries
A collection of important machine learning algorithms implemented without using any machine learning libraries; numpy and pandas are used only to improve the algorithms' computational efficiency.
Stars: ✭ 77 (-85.08%)
Mutual labels:  jupyter-notebook, machine-learning-algorithms, artificial-neural-networks
Pysot
Surrogate Optimization Toolbox for Python
Stars: ✭ 136 (-73.64%)
Mutual labels:  jupyter-notebook, bayesian-optimization, optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (-72.09%)
Mutual labels:  optimization, hyperparameter-optimization, bayesian-optimization

Hyperparameter Optimization of Machine Learning Algorithms

This code provides a hyper-parameter optimization implementation for machine learning algorithms, as described in the paper "On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice".

To fit a machine learning model to different problems, its hyper-parameters must be tuned, and the selected hyper-parameter configuration has a direct impact on the model's performance. This paper studies the optimization of the hyper-parameters of common machine learning models: it introduces several state-of-the-art optimization techniques, discusses how to apply them to machine learning algorithms, surveys the libraries and frameworks available for hyper-parameter optimization, and discusses open challenges in hyper-parameter optimization research. Moreover, experiments are conducted on benchmark datasets to compare the performance of the different optimization methods and to provide practical examples of hyper-parameter optimization.

This paper and code will help industrial users, data analysts, and researchers develop better machine learning models by identifying appropriate hyper-parameter configurations efficiently.
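
As a minimal, self-contained illustration of this workflow (a sketch using scikit-learn's public API, not the paper's or this repository's own code), random search over a random forest's hyper-parameters might look like this:

```python
# Minimal hyper-parameter tuning sketch (not the paper's code).
from scipy.stats import randint
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_digits(return_X_y=True)

# Search space loosely following the configuration table further below.
param_dist = {
    "n_estimators": randint(10, 100),
    "max_depth": randint(5, 50),
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=param_dist,
    n_iter=20,
    cv=3,
    scoring="accuracy",
    random_state=0,
)
search.fit(X, y)
print("Best accuracy:", search.best_score_)
print("Best hyper-parameters:", search.best_params_)
```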

Paper

On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice
One-column version: arXiv
Two-column version: Elsevier

Quick Navigation

Section 3: Important hyper-parameters of common machine learning algorithms
Section 4: Hyper-parameter optimization techniques introduction
Section 5: How to choose optimization techniques for different machine learning models
Section 6: Common Python libraries/tools for hyper-parameter optimization
Section 7: Experimental results (sample code in "HPO_Regression.ipynb" and "HPO_Classification.ipynb")
Section 8: Open challenges and future research directions
Summary table for Sections 3-6: Table 2: A comprehensive overview of common ML models, their hyper-parameters, suitable optimization techniques, and available Python libraries
Summary table for Section 8: Table 10: The open challenges and future directions of HPO research

Implementation

Sample code for hyper-parameter optimization implementation for machine learning algorithms is provided in this repository.

Sample code for Regression problems

HPO_Regression.ipynb
Dataset used: Boston Housing
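
A hedged sketch of the regression workflow (the notebook's exact code may differ; note that load_boston was removed from scikit-learn 1.2+, so this self-contained example substitutes the built-in diabetes dataset):

```python
# Regression tuning sketch (the notebook uses Boston Housing instead).
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

X, y = load_diabetes(return_X_y=True)

# A coarse grid drawn from the configuration table below.
param_grid = {"n_estimators": [10, 50, 100], "max_depth": [5, 25, 50]}
search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid,
    cv=3,
    scoring="neg_mean_squared_error",
)
search.fit(X, y)
print("Best CV MSE:", -search.best_score_)
print("Best hyper-parameters:", search.best_params_)
```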

Sample code for Classification problems

HPO_Classification.ipynb
Dataset used: MNIST
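
A hedged sketch of the classification workflow; the notebook uses MNIST, while this self-contained example substitutes scikit-learn's small built-in digits dataset and the SVM search space from the table below:

```python
# Classification tuning sketch (the notebook uses MNIST instead).
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# SVM classifier space from the configuration table below.
param_grid = {"C": [0.1, 1, 10, 50],
              "kernel": ["linear", "poly", "rbf", "sigmoid"]}
search = GridSearchCV(SVC(), param_grid, cv=3, scoring="accuracy")
search.fit(X, y)
print("Best accuracy:", search.best_score_)
print("Best hyper-parameters:", search.best_params_)
```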

Machine Learning & Deep Learning Algorithms

  • Random forest (RF)
  • Support vector machine (SVM)
  • K-nearest neighbor (KNN)
  • Artificial Neural Networks (ANN)

Hyperparameter Configuration Space

| ML Model       | Hyper-parameter   | Type        | Search Space                        |
|----------------|-------------------|-------------|-------------------------------------|
| RF Classifier  | n_estimators      | Discrete    | [10, 100]                           |
|                | max_depth         | Discrete    | [5, 50]                             |
|                | min_samples_split | Discrete    | [2, 11]                             |
|                | min_samples_leaf  | Discrete    | [1, 11]                             |
|                | criterion         | Categorical | 'gini', 'entropy'                   |
|                | max_features      | Discrete    | [1, 64]                             |
| SVM Classifier | C                 | Continuous  | [0.1, 50]                           |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid'  |
| KNN Classifier | n_neighbors       | Discrete    | [1, 20]                             |
| ANN Classifier | optimizer         | Categorical | 'adam', 'rmsprop', 'sgd'            |
|                | activation        | Categorical | 'relu', 'tanh'                      |
|                | batch_size        | Discrete    | [16, 64]                            |
|                | neurons           | Discrete    | [10, 100]                           |
|                | epochs            | Discrete    | [20, 50]                            |
|                | patience          | Discrete    | [3, 20]                             |
| RF Regressor   | n_estimators      | Discrete    | [10, 100]                           |
|                | max_depth         | Discrete    | [5, 50]                             |
|                | min_samples_split | Discrete    | [2, 11]                             |
|                | min_samples_leaf  | Discrete    | [1, 11]                             |
|                | criterion         | Categorical | 'mse', 'mae'                        |
|                | max_features      | Discrete    | [1, 13]                             |
| SVM Regressor  | C                 | Continuous  | [0.1, 50]                           |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid'  |
|                | epsilon           | Continuous  | [0.001, 1]                          |
| KNN Regressor  | n_neighbors       | Discrete    | [1, 20]                             |
| ANN Regressor  | optimizer         | Categorical | 'adam', 'rmsprop'                   |
|                | activation        | Categorical | 'relu', 'tanh'                      |
|                | loss              | Categorical | 'mse', 'mae'                        |
|                | batch_size        | Discrete    | [16, 64]                            |
|                | neurons           | Discrete    | [10, 100]                           |
|                | epochs            | Discrete    | [20, 50]                            |
|                | patience          | Discrete    | [3, 20]                             |
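
Expressed in code, a search space such as the RF classifier's can be written as a plain Python dictionary and passed to, e.g., RandomizedSearchCV (a sketch; the notebooks may encode the ranges differently):

```python
# RF classifier search space from the table above, as a sketch.
from scipy.stats import randint

rf_clf_space = {
    "n_estimators": randint(10, 100),
    "max_depth": randint(5, 50),
    "min_samples_split": randint(2, 11),
    "min_samples_leaf": randint(1, 11),
    "criterion": ["gini", "entropy"],   # categorical values as a list
    "max_features": randint(1, 64),
}
```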

HPO Algorithms

  • Grid search
  • Random search
  • Hyperband
  • Bayesian Optimization with Gaussian Processes (BO-GP)
  • Bayesian Optimization with Tree-structured Parzen Estimator (BO-TPE)
  • Particle swarm optimization (PSO)
  • Genetic algorithm (GA)
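
As a concrete, hedged example of one of these methods, here is BO-TPE via the hyperopt library (the notebooks' exact objective and search space may differ):

```python
# BO-TPE sketch with hyperopt; hyperopt minimizes, hence negative accuracy.
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

space = {
    "n_estimators": hp.quniform("n_estimators", 10, 100, 1),
    "max_depth": hp.quniform("max_depth", 5, 50, 1),
}

def objective(params):
    clf = RandomForestClassifier(
        n_estimators=int(params["n_estimators"]),
        max_depth=int(params["max_depth"]),
        random_state=0,
    )
    score = cross_val_score(clf, X, y, cv=3, scoring="accuracy").mean()
    return {"loss": -score, "status": STATUS_OK}

best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=20, trials=Trials())
print("Best hyper-parameters:", best)
```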

Requirements
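
The dependency list is not reproduced here; based on the algorithms and notebooks above, an environment along the following lines should suffice (the exact package set is an assumption, not taken from the repository):

  • scikit-learn, numpy, scipy (models, grid search, random search)
  • keras or tensorflow (ANN models)
  • hyperopt (BO-TPE)
  • scikit-optimize (BO-GP)
  • optunity (PSO)
  • deap and tpot (GA)
  • a Hyperband implementation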

Contact-Info

Please feel free to contact me with any questions or about cooperation opportunities; I'd be happy to help.

Citation

If you find this repository useful in your research, please cite this article as:

L. Yang and A. Shami, “On hyperparameter optimization of machine learning algorithms: Theory and practice,” Neurocomputing, vol. 415, pp. 295–316, 2020, doi: https://doi.org/10.1016/j.neucom.2020.07.061.

@article{YANG2020295,
  title   = "On hyperparameter optimization of machine learning algorithms: Theory and practice",
  author  = "Li Yang and Abdallah Shami",
  journal = "Neurocomputing",
  volume  = "415",
  pages   = "295--316",
  year    = "2020",
  issn    = "0925-2312",
  doi     = "https://doi.org/10.1016/j.neucom.2020.07.061",
  url     = "http://www.sciencedirect.com/science/article/pii/S0925231220311693"
}