ProxGradPytorch: PyTorch implementation of proximal gradient algorithms à la Parikh and Boyd (2014). Useful for auto-sizing (Murray and Chiang 2015; Murray et al. 2019).
Stars: ✭ 28 (-97.61%)
Meta-SAC: Auto-tunes the entropy temperature of Soft Actor-Critic via metagradient (7th ICML AutoML Workshop, 2020).
Stars: ✭ 19 (-98.38%)
tuneta: Intelligently optimizes technical indicators and optionally selects the least intercorrelated ones for use in machine learning models.
Stars: ✭ 77 (-93.42%)
FAR-HO: Gradient-based hyperparameter optimization and meta-learning package for TensorFlow.
Stars: ✭ 161 (-86.25%)
cmaes: Python library for CMA Evolution Strategy.
Stars: ✭ 174 (-85.14%)
TSCV: Time series cross-validation, an extension for scikit-learn.
Stars: ✭ 145 (-87.62%)
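TSCV extends the walk-forward splitting that scikit-learn already ships; for context, the base pattern it builds on (shown here with scikit-learn's own TimeSeriesSplit rather than TSCV's splitters, so no extra dependency is needed) looks like:

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

X = np.arange(12).reshape(-1, 1)  # 12 ordered observations

# each fold trains on a prefix and tests on the window that follows,
# so the model never sees the future
splits = list(TimeSeriesSplit(n_splits=3).split(X))
for train_idx, test_idx in splits:
    assert train_idx.max() < test_idx.min()  # train always precedes test
```

TSCV's contribution, per its description, is additional scikit-learn-compatible splitters tailored to time series, usable wherever a `cv` object is accepted.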
cli: Polyaxon core client and CLI to streamline MLOps.
Stars: ✭ 18 (-98.46%)
Optuna: A hyperparameter optimization framework.
Stars: ✭ 5,679 (+384.97%)
DeepArchitect (legacy): Automatically designing and training deep architectures.
Stars: ✭ 144 (-87.7%)
AutoGluon: AutoML for text, image, and tabular data.
Stars: ✭ 3,920 (+234.76%)
MiraiML: Asynchronous, autonomous, and continuous machine learning in Python.
Stars: ✭ 23 (-98.04%)
shadho: Scalable, structured, dynamically scheduled hyperparameter optimization.
Stars: ✭ 17 (-98.55%)
Robyn: An experimental, automated, open-source Marketing Mix Modeling (MMM) package from Facebook Marketing Science. It uses various machine learning techniques (ridge regression with cross-validation, a multi-objective evolutionary algorithm for hyperparameter optimisation, gradient-based optimisation for budget allocation, etc.) to define m…
Stars: ✭ 433 (-63.02%)
DeepTraffic: MIT DeepTraffic top-2% solution (75.01 mph) 🚗.
Stars: ✭ 47 (-95.99%)
go-bayesopt: A library for Bayesian optimization using Gaussian processes (black-box optimizer) in Go.
Stars: ✭ 47 (-95.99%)
mlrHyperopt: Easy hyperparameter optimization with mlr and mlrMBO.
Stars: ✭ 30 (-97.44%)
Deeplearning Notes: Notes for the Deep Learning Specialization courses led by Andrew Ng.
Stars: ✭ 126 (-89.24%)
scicloj.ml: A Clojure machine learning library.
Stars: ✭ 152 (-87.02%)
hyper-engine: Python library for Bayesian hyperparameter optimization.
Stars: ✭ 80 (-93.17%)
Bayesian Optimization: Python code for Bayesian optimization using Gaussian processes.
Stars: ✭ 245 (-79.08%)
Awesome AutoML and Lightweight Models: A list of high-quality (newest) AutoML works and lightweight models, covering 1) neural architecture search, 2) lightweight structures, 3) model compression, quantization, and acceleration, 4) hyperparameter optimization, and 5) automated feature engineering.
Stars: ✭ 691 (-40.99%)
osprey: 🦅 Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (-93.94%)
BOML: A bilevel optimization library in Python for multi-task and meta-learning.
Stars: ✭ 120 (-89.75%)
Hyperopt.jl: Hyperparameter optimization in Julia.
Stars: ✭ 144 (-87.7%)
DeepHyper: Scalable asynchronous neural architecture and hyperparameter search for deep neural networks.
Stars: ✭ 117 (-90.01%)
Hyperactive: A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine learning models.
Stars: ✭ 182 (-84.46%)
Chocolate: A fully decentralized hyperparameter optimization framework.
Stars: ✭ 112 (-90.44%)
autotune: Autonomous performance tuning for Kubernetes!
Stars: ✭ 84 (-92.83%)
DeepArchitect: A general, modular, and programmable architecture search framework.
Stars: ✭ 110 (-90.61%)
optuna-allennlp: 🚀 A demonstration of hyperparameter optimization using Optuna for models implemented with AllenNLP.
Stars: ✭ 17 (-98.55%)
BTB: A simple, extensible library for developing AutoML systems.
Stars: ✭ 159 (-86.42%)
Hyperboard: A web-based dashboard for deep learning.
Stars: ✭ 336 (-71.31%)
mlmodels: Machine learning and deep learning model zoo for PyTorch, TensorFlow, Keras, and Gluon models…
Stars: ✭ 145 (-87.62%)
cerebro-system: A data system for optimized deep learning model selection.
Stars: ✭ 15 (-98.72%)
NNI: An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
Stars: ✭ 10,698 (+813.58%)
hone: A shell-friendly hyperparameter search tool inspired by Optuna.
Stars: ✭ 17 (-98.55%)
HORD: Efficient hyperparameter optimization of deep learning algorithms using deterministic RBF surrogates.
Stars: ✭ 99 (-91.55%)
ultraopt: A distributed asynchronous hyperparameter optimization library, claimed to be better than HyperOpt.
Stars: ✭ 93 (-92.06%)
auto_ml: [UNMAINTAINED] Automated machine learning for analytics and production.
Stars: ✭ 1,559 (+33.13%)
Xcessiv: A web-based application for quick, scalable, and automated hyperparameter tuning and stacked ensembling in Python.
Stars: ✭ 1,255 (+7.17%)
Hypertunity: A toolset for black-box hyperparameter optimisation.
Stars: ✭ 119 (-89.84%)
diviner: A serverless machine learning and hyperparameter tuning platform.
Stars: ✭ 19 (-98.38%)
tune-sklearn: A drop-in replacement for scikit-learn's GridSearchCV/RandomizedSearchCV, but with cutting-edge hyperparameter tuning techniques.
Stars: ✭ 241 (-79.42%)
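Because tune-sklearn is billed as a drop-in replacement, its usage mirrors scikit-learn's own search estimators; the underlying pattern (shown with scikit-learn's GridSearchCV so the sketch runs without extra dependencies) is:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
param_grid = {"C": [0.1, 1.0, 10.0], "gamma": ["scale", "auto"]}

# tune-sklearn's TuneGridSearchCV takes the same estimator/param_grid/cv
# arguments, so swapping the class name is the advertised migration path
search = GridSearchCV(SVC(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Keeping the estimator interface intact means existing pipelines, scorers, and `best_params_` handling carry over unchanged.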
AdaTune: A gradient-based hyperparameter tuning library in PyTorch.
Stars: ✭ 226 (-80.7%)
Talos: Hyperparameter optimization for TensorFlow, Keras, and PyTorch.
Stars: ✭ 1,382 (+18.02%)
allennlp-optuna: ⚡️ An AllenNLP plugin adding subcommands to use Optuna, making hyperparameter optimization easy.
Stars: ✭ 33 (-97.18%)
Awesome AutoML Papers: A curated list of automated machine learning papers, articles, tutorials, slides, and projects.
Stars: ✭ 3,198 (+173.1%)
cmaes: Python library for CMA Evolution Strategy.
Stars: ✭ 88 (-92.49%)
optuna-examples: Examples for https://github.com/optuna/optuna.
Stars: ✭ 238 (-79.68%)
Facet: Human-explainable AI.
Stars: ✭ 269 (-77.03%)
keras-hypetune: A friendly Python package for Keras hyperparameter tuning based only on NumPy and hyperopt.
Stars: ✭ 47 (-95.99%)
TPOT: A Python automated machine learning tool that optimizes machine learning pipelines using genetic programming.
Stars: ✭ 8,378 (+615.46%)
mljar-supervised: Automated machine learning pipeline with feature engineering and hyperparameter tuning 🚀
Stars: ✭ 961 (-17.93%)
irace: Iterated racing for automatic algorithm configuration.
Stars: ✭ 26 (-97.78%)
Deeplearning.ai Notes: My notes prepared during the Deep Learning Specialization taught by AI guru Andrew Ng. I have used diagrams and code snippets from the course wherever needed, while following the Honor Code.
Stars: ✭ 262 (-77.63%)