cli: Polyaxon Core Client & CLI to streamline MLOps
Stars: ✭ 18 (-95.21%)
Hypernets: A general automated machine learning framework to simplify the development of end-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-41.22%)
osprey: 🦅 Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (-81.12%)
maggy: Distribution-transparent machine learning experiments on Apache Spark
Stars: ✭ 83 (-77.93%)
Orion: Asynchronous distributed hyperparameter optimization.
Stars: ✭ 186 (-50.53%)
optuna-allennlp: 🚀 A demonstration of hyperparameter optimization using Optuna for models implemented with AllenNLP.
Stars: ✭ 17 (-95.48%)
mlr3tuning: Hyperparameter optimization package of the mlr3 ecosystem
Stars: ✭ 44 (-88.3%)
Robyn: An experimental, automated, open-sourced Marketing Mix Modeling (MMM) package from Facebook Marketing Science. It uses various machine learning techniques (ridge regression with cross-validation, a multi-objective evolutionary algorithm for hyperparameter optimisation, gradient-based optimisation for budget allocation, etc.) to define m…
Stars: ✭ 433 (+15.16%)
Gpflowopt: Bayesian optimization using GPflow
Stars: ✭ 229 (-39.1%)
optuna-examples: Examples for https://github.com/optuna/optuna
Stars: ✭ 238 (-36.7%)
randopt: Streamlined machine learning experiment management.
Stars: ✭ 108 (-71.28%)
Auptimizer: An automatic ML model optimization tool.
Stars: ✭ 166 (-55.85%)
bbopt: Black-box hyperparameter optimization made easy.
Stars: ✭ 66 (-82.45%)
naturalselection: A general-purpose Pythonic genetic algorithm.
Stars: ✭ 17 (-95.48%)
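The selection–crossover–mutation loop at the heart of a genetic algorithm like naturalselection can be sketched in a few lines of plain Python. This is a toy illustration under assumed names, not the library's actual API:

```python
import random

def genetic_maximize(fitness, bounds, pop_size=20, generations=40,
                     mutation_rate=0.2, seed=0):
    """Minimal generational GA: truncation selection, blend crossover,
    Gaussian mutation. Maximizes `fitness` over a 1-D interval."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        elite = ranked[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = (a + b) / 2                   # blend crossover
            if rng.random() < mutation_rate:
                child += rng.gauss(0, (hi - lo) * 0.05)
            children.append(min(hi, max(lo, child)))
        pop = elite + children
    return max(pop, key=fitness)

# Toy objective with its maximum at x = 3.
best = genetic_maximize(lambda x: -(x - 3.0) ** 2, bounds=(-10, 10))
```

Real libraries add tournament selection, multi-dimensional genomes, and parallel fitness evaluation, but the generational skeleton is the same.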
FEDOT: An automated modeling and machine learning framework.
Stars: ✭ 312 (-17.02%)
mango: Parallel hyperparameter tuning in Python
Stars: ✭ 241 (-35.9%)
miraiml: Asynchronous, autonomous, and continuous machine learning in Python
Stars: ✭ 23 (-93.88%)
optkeras: A wrapper around Keras and Optuna for hyperparameter optimization
Stars: ✭ 29 (-92.29%)
Ray: An open-source framework that provides a simple, universal API for building distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library.
Stars: ✭ 18,547 (+4832.71%)
allennlp-optuna: ⚡️ AllenNLP plugin that adds subcommands for using Optuna, making hyperparameter optimization easy
Stars: ✭ 33 (-91.22%)
Cornell Moe: A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.
Stars: ✭ 198 (-47.34%)
Hyperas: Keras + Hyperopt: a very simple wrapper for convenient hyperparameter optimization
Stars: ✭ 2,110 (+461.17%)
ProxGradPytorch: PyTorch implementation of proximal gradient algorithms à la Parikh and Boyd (2014). Useful for auto-sizing (Murray and Chiang 2015; Murray et al. 2019).
Stars: ✭ 28 (-92.55%)
Rl Baselines3 Zoo: A collection of pre-trained RL agents using Stable Baselines3, with training and hyperparameter optimization included.
Stars: ✭ 161 (-57.18%)
codeflare: Simplifying the definition, execution, scaling, and deployment of pipelines on the cloud.
Stars: ✭ 163 (-56.65%)
ml-pipeline: Using Kafka-Python to illustrate an ML production pipeline
Stars: ✭ 90 (-76.06%)
tuneta: Intelligently optimizes technical indicators and optionally selects the least intercorrelated ones for use in machine learning models
Stars: ✭ 77 (-79.52%)
syne-tune: Large-scale, asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-72.07%)
bbai: Set model hyperparameters using deterministic, exact algorithms.
Stars: ✭ 19 (-94.95%)
Awesome Automl Papers: A curated list of automated machine learning papers, articles, tutorials, slides, and projects
Stars: ✭ 3,198 (+750.53%)
cmaes: Python library for the CMA Evolution Strategy (CMA-ES).
Stars: ✭ 174 (-53.72%)
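Full CMA-ES adapts an entire covariance matrix; the family's simplest member, a (1+1) evolution strategy with step-size adaptation, fits in a few lines and conveys the core idea. This is a hedged sketch with made-up names, not the cmaes library's API:

```python
import random

def one_plus_one_es(loss, x0, sigma=1.0, iterations=200, seed=0):
    """(1+1) evolution strategy: one parent, one Gaussian-mutated child per
    step, with multiplicative step-size control (a crude 1/5-success rule)."""
    rng = random.Random(seed)
    x, fx = x0, loss(x0)
    for _ in range(iterations):
        candidate = x + rng.gauss(0, sigma)
        fc = loss(candidate)
        if fc < fx:
            x, fx = candidate, fc
            sigma *= 1.1          # success: widen the search
        else:
            sigma *= 0.97         # failure: narrow it
    return x

# Toy quadratic with its minimum at x = 2.
xmin = one_plus_one_es(lambda x: (x - 2.0) ** 2, x0=10.0)
```

CMA-ES generalizes this by sampling a population per step and adapting both the step size and the shape of the search distribution, which is what makes it effective on ill-conditioned, non-separable problems.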
shadho: Scalable, structured, dynamically scheduled hyperparameter optimization.
Stars: ✭ 17 (-95.48%)
polystores: A library for performing hyperparameter optimization
Stars: ✭ 48 (-87.23%)
go-bayesopt: A library for Bayesian optimization using Gaussian processes (black-box optimizer) in Go/Golang.
Stars: ✭ 47 (-87.5%)
ultraopt: Distributed asynchronous hyperparameter optimization, better than HyperOpt.
Stars: ✭ 93 (-75.27%)
scikit-hyperband: A scikit-learn-compatible implementation of Hyperband
Stars: ✭ 68 (-81.91%)
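Hyperband is built on successive halving: evaluate many configurations at a small budget, keep the best fraction, and rerun the survivors with more budget. A minimal sketch — the toy loss and all names here are illustrative assumptions, not scikit-hyperband's API:

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Keep the best 1/eta of configs each round, multiplying the budget
    for survivors by eta, until a single configuration remains."""
    survivors, budget = list(configs), min_budget
    while len(survivors) > 1:
        # Lower score is better (e.g. validation loss after `budget` epochs).
        ranked = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = ranked[: max(1, len(survivors) // eta)]
        budget *= eta
    return survivors[0]

def toy_loss(lr, budget):
    # Pretend objective: the best learning rate is 0.1, and extra budget
    # shrinks a noise-floor term.
    return (lr - 0.1) ** 2 + 0.1 / budget

candidates = [0.001, 0.01, 0.03, 0.05, 0.1, 0.3, 1.0, 3.0, 10.0]
best_lr = successive_halving(candidates, toy_loss)  # → 0.1
```

Hyperband proper repeats this routine over several "brackets" that trade off the number of starting configurations against the starting budget, hedging against losses that rank configurations poorly at small budgets.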
Sherpa: Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Stars: ✭ 289 (-23.14%)
scicloj.ml: A Clojure machine learning library
Stars: ✭ 152 (-59.57%)
autotune: Autonomous performance tuning for Kubernetes!
Stars: ✭ 84 (-77.66%)
Bayesian Optimization: Python code for Bayesian optimization using Gaussian processes
Stars: ✭ 245 (-34.84%)
mlrHyperopt: Easy hyperparameter optimization with mlr and mlrMBO.
Stars: ✭ 30 (-92.02%)
Lale: Library for semi-automated data science
Stars: ✭ 198 (-47.34%)
hone: A shell-friendly hyperparameter search tool inspired by Optuna
Stars: ✭ 17 (-95.48%)
Coursera Deep Learning Specialization: Notes, programming assignments, and quizzes from all courses in the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-50%)
Meta-SAC: Auto-tune the entropy temperature of Soft Actor-Critic via metagradient (7th ICML AutoML Workshop, 2020)
Stars: ✭ 19 (-94.95%)
Hyperactive: A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine learning models.
Stars: ✭ 182 (-51.6%)
mltb: Machine Learning Tool Box
Stars: ✭ 25 (-93.35%)
Mlrmbo: Toolbox for Bayesian optimization and model-based optimization in R
Stars: ✭ 173 (-53.99%)
hyper-engine: Python library for Bayesian hyperparameter optimization
Stars: ✭ 80 (-78.72%)
keras-hypetune: A friendly Python package for Keras hyperparameter tuning based only on NumPy and hyperopt.
Stars: ✭ 47 (-87.5%)
Autogluon: AutoML for text, image, and tabular data
Stars: ✭ 3,920 (+942.55%)
mindware: An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyperparameter tuning.
Stars: ✭ 34 (-90.96%)
Hyperopt.jl: Hyperparameter optimization in Julia.
Stars: ✭ 144 (-61.7%)
textlearnR: A simple collection of well-working NLP models (Keras, H2O, StarSpace), tuned and benchmarked on a variety of datasets.
Stars: ✭ 16 (-95.74%)