osprey - 🦅 Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (-28.28%)
Predictive Maintenance Using Lstm - Example of Multiple Multivariate Time Series Prediction with LSTM Recurrent Neural Networks in Python with Keras.
Stars: ✭ 352 (+255.56%)
allennlp-optuna - ⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy
Stars: ✭ 33 (-66.67%)
Mgo - Purely functional genetic algorithms for multi-objective optimisation
Stars: ✭ 63 (-36.36%)
optuna-examples - Examples for https://github.com/optuna/optuna
Stars: ✭ 238 (+140.4%)
Gradient Free Optimizers - Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+618.18%)
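Several of the entries above search discrete spaces without gradients. As a minimal sketch of the core idea (this is not the Gradient Free Optimizers API; the search space, objective, and function names below are invented for illustration), plain random search already captures the basic loop: sample a candidate, score it, keep the best.

```python
import random

# Hypothetical discrete search space and toy objective -- illustration only.
search_space = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, 7],
    "learning_rate": [0.01, 0.1, 0.3],
}

def objective(params):
    # Toy score standing in for a model's validation accuracy:
    # peaks at n_estimators=100, max_depth=5, learning_rate=0.01.
    return (-abs(params["n_estimators"] - 100) / 100
            - abs(params["max_depth"] - 5)
            - params["learning_rate"])

def random_search(space, objective, n_iter=200, seed=0):
    """Sample n_iter random points from the discrete space, return the best."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_iter):
        params = {k: rng.choice(v) for k, v in space.items()}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = random_search(search_space, objective)
```

Population-based and sequential methods improve on this baseline by reusing information from earlier evaluations instead of sampling independently.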
textlearnR - A simple collection of well-working NLP models (Keras, H2O, StarSpace) tuned and benchmarked on a variety of datasets.
Stars: ✭ 16 (-83.84%)
AutoGluon - AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+3859.6%)
ProxGradPytorch - PyTorch implementation of Proximal Gradient Algorithms a la Parikh and Boyd (2014). Useful for Auto-Sizing (Murray and Chiang 2015, Murray et al. 2019).
Stars: ✭ 28 (-71.72%)
Cmaes - Python library for CMA Evolution Strategy.
Stars: ✭ 88 (-11.11%)
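Full CMA-ES, as implemented by the cmaes library, adapts a complete covariance matrix over the search distribution. A much-reduced sketch from the same family, a (1+1) evolution strategy with 1/5-rule step-size control (the function below is my own illustration, not the library's API), shows the basic sample-compare-adapt loop:

```python
import random

def one_plus_one_es(objective, x0, sigma=1.0, n_iter=200, seed=0):
    """Minimize objective with a (1+1) evolution strategy using a
    1/5-success-rule style step-size adaptation. CMA-ES generalizes
    this by also adapting a full covariance matrix."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    for _ in range(n_iter):
        # Sample one offspring from an isotropic Gaussian around x.
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = objective(y)
        if fy < fx:
            x, fx = y, fy
            sigma *= 1.5   # success: widen the search
        else:
            sigma *= 0.9   # failure: narrow the search
    return x, fx

# Sphere function: optimum at the origin.
x_best, f_best = one_plus_one_es(lambda v: sum(t * t for t in v), [3.0, -2.0])
```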
ml-pipeline - Using Kafka-Python to illustrate an ML production pipeline
Stars: ✭ 90 (-9.09%)
Test Tube - Python library to easily log experiments and parallelize hyperparameter search for neural networks
Stars: ✭ 663 (+569.7%)
NumpyDL - Deep learning library for education, based on pure NumPy. Supports CNN, RNN, LSTM, GRU, etc.
Stars: ✭ 206 (+108.08%)
bbai - Set model hyperparameters using deterministic, exact algorithms.
Stars: ✭ 19 (-80.81%)
Tpot - A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.
Stars: ✭ 8,378 (+8362.63%)
Meta-SAC - Auto-tune the entropy temperature of Soft Actor-Critic via metagradient (7th ICML AutoML Workshop, 2020)
Stars: ✭ 19 (-80.81%)
sparsezoo - Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+166.67%)
shadho - Scalable, structured, dynamically-scheduled hyperparameter optimization.
Stars: ✭ 17 (-82.83%)
deep-blueberry - If you've always wanted to learn about deep learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-82.83%)
360sd Net - PyTorch implementation of the ICRA 2020 paper "360° Stereo Depth Estimation with Learnable Cost Volume"
Stars: ✭ 94 (-5.05%)
optkeras - OptKeras: wrapper around Keras and Optuna for hyperparameter optimization
Stars: ✭ 29 (-70.71%)
tuneta - Intelligently optimizes technical indicators and optionally selects the least intercorrelated for use in machine learning models
Stars: ✭ 77 (-22.22%)
Hypernets - A general automated machine learning framework to simplify the development of end-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+123.23%)
Auto Sklearn - Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+5875.76%)
Ray - An open source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+18634.34%)
Deep-Learning - Coursework and practice completed while learning deep learning 🚀 👨‍💻
Stars: ✭ 21 (-78.79%)
Gpflowopt - Bayesian Optimization using GPflow
Stars: ✭ 229 (+131.31%)
Deep traffic - MIT DeepTraffic top 2% solution (75.01 mph) 🚗
Stars: ✭ 47 (-52.53%)
Cornell Moe - A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (+100%)
polystores - A library for performing hyperparameter optimization
Stars: ✭ 48 (-51.52%)
Orion - Asynchronous Distributed Hyperparameter Optimization.
Stars: ✭ 186 (+87.88%)
Hyperband - Tuning hyperparameters fast with Hyperband
Stars: ✭ 555 (+460.61%)
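Hyperband is built around successive halving: start many configurations on a small budget, then repeatedly keep the best fraction and multiply the budget. A toy, self-contained sketch of that subroutine (the training function is simulated, not a real model, and all names are invented for illustration):

```python
import math

def successive_halving(configs, train_eval, min_budget=1, eta=3):
    """Core subroutine of Hyperband: train all surviving configurations,
    keep the best 1/eta of them, and give the survivors eta times more
    budget, until one configuration remains."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        scored = [(train_eval(c, budget), c) for c in survivors]
        scored.sort()  # lower loss is better
        survivors = [c for _, c in scored[: max(1, len(scored) // eta)]]
        budget *= eta
    return survivors[0]

# Toy setting: each "configuration" is a learning rate, and the simulated
# loss shrinks with budget toward an lr-dependent floor (best near 1e-2).
def fake_train_eval(lr, budget):
    return abs(math.log10(lr) + 2) + 1.0 / budget

best = successive_halving([1e-4, 1e-3, 1e-2, 1e-1], fake_train_eval)
```

The full Hyperband algorithm additionally sweeps over several such brackets that trade off the number of starting configurations against the starting budget.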
Hyperas - Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
Stars: ✭ 2,110 (+2031.31%)
AMP-Regularizer - Code for the paper "Regularizing Neural Networks via Adversarial Model Perturbation" (CVPR 2021)
Stars: ✭ 26 (-73.74%)
Auptimizer - An automatic ML model optimization tool.
Stars: ✭ 166 (+67.68%)
Far Ho - Gradient-based hyperparameter optimization and meta-learning package for TensorFlow
Stars: ✭ 161 (+62.63%)
Tscv - Time Series Cross-Validation, an extension for scikit-learn
Stars: ✭ 145 (+46.46%)
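Unlike shuffled k-fold, time-series cross-validation must keep every training window strictly before its test window. A minimal expanding-window splitter written from scratch for illustration (this is not TSCV's actual API) shows the constraint, including an optional gap between the windows:

```python
def walk_forward_splits(n_samples, n_splits, gap=0):
    """Yield (train_indices, test_indices) pairs where each training
    window ends before its test window starts, optionally separated by
    `gap` samples, so the model never sees the future."""
    fold = n_samples // (n_splits + 1)
    for k in range(1, n_splits + 1):
        train_end = k * fold
        test_start = train_end + gap
        test_end = min(test_start + fold, n_samples)
        yield list(range(train_end)), list(range(test_start, test_end))

# 10 samples, 3 folds, 1-sample gap between train and test.
splits = list(walk_forward_splits(10, 3, gap=1))
```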
Atm - Auto Tune Models: a multi-tenant, multi-data system for automated machine learning (model selection and tuning).
Stars: ✭ 504 (+409.09%)
Deep architect legacy - DeepArchitect: Automatically Designing and Training Deep Architectures
Stars: ✭ 144 (+45.45%)
hyper-engine - Python library for Bayesian hyperparameter optimization
Stars: ✭ 80 (-19.19%)
Pbt - Population Based Training (in PyTorch with sqlite3). Status: unsupported.
Stars: ✭ 138 (+39.39%)
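Population Based Training interleaves ordinary training with an exploit/explore step: a poorly scoring worker copies a stronger worker's weights and then perturbs its hyperparameters. A toy sketch of that single step (the dict layout and function name below are invented for illustration, not this repository's code):

```python
import random

def pbt_step(population, rng):
    """One exploit/explore step of Population Based Training. Each worker
    is a dict with 'score', 'hyperparams', and 'weights' (toy stand-ins).
    The worst worker copies the best worker's state, then perturbs the
    copied hyperparameters by a random factor."""
    population.sort(key=lambda w: w["score"])
    worst, best = population[0], population[-1]
    worst["weights"] = dict(best["weights"])                 # exploit
    worst["hyperparams"] = {                                 # explore
        k: v * rng.choice([0.8, 1.2]) for k, v in best["hyperparams"].items()
    }
    return population

rng = random.Random(0)
pop = [
    {"score": 0.2, "hyperparams": {"lr": 0.1}, "weights": {"w": 1.0}},
    {"score": 0.9, "hyperparams": {"lr": 0.01}, "weights": {"w": 2.0}},
]
pop = pbt_step(pop, rng)
```

In a real implementation each worker would keep training between such steps; here only the selection logic is shown.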
Mljar Supervised - Automated machine learning pipeline with feature engineering and hyperparameter tuning 🚀
Stars: ✭ 961 (+870.71%)
Hyperopt.jl - Hyperparameter optimization in Julia.
Stars: ✭ 144 (+45.45%)
Hyperopt Keras Cnn Cifar 100 - Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could easily be transferred to another dataset or classification task.
Stars: ✭ 95 (-4.04%)
Machine Learning Algorithms - A curated list of almost all machine learning and deep learning algorithms, grouped by category.
Stars: ✭ 92 (-7.07%)
Determined - Deep Learning Training Platform
Stars: ✭ 1,171 (+1082.83%)
Rl Baselines Zoo - A collection of 100+ pre-trained RL agents using Stable Baselines, with training and hyperparameter optimization included.
Stars: ✭ 839 (+747.47%)
Simple - Experimental Global Optimization Algorithm
Stars: ✭ 450 (+354.55%)
syne-tune - Large-scale, asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (+6.06%)