Xcessiv: A web-based application for quick, scalable, and automated hyperparameter tuning and stacked ensembling in Python.
Stars: ✭ 1,255 (+945.83%)
Matchingnetworks: PyTorch code that replicates the results of the "Matching Networks for One Shot Learning" paper on the Omniglot and MiniImageNet datasets.
Stars: ✭ 256 (+113.33%)
Mt Net: Code accompanying the ICML 2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace".
Stars: ✭ 30 (-75%)
HebbianMetaLearning: Meta-Learning through Hebbian Plasticity in Random Networks (https://arxiv.org/abs/2007.02686).
Stars: ✭ 77 (-35.83%)
Talos: Hyperparameter optimization for TensorFlow, Keras and PyTorch.
Stars: ✭ 1,382 (+1051.67%)
tuneta: Intelligently optimizes technical indicators and optionally selects the least intercorrelated ones for use in machine learning models.
Stars: ✭ 77 (-35.83%)
Transferlearning: Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+6967.5%)
Learn2learn: A PyTorch library for meta-learning research.
Stars: ✭ 1,193 (+894.17%)
dropclass speaker: DropClass and DropAdapt, repository for the paper accepted to Speaker Odyssey 2020.
Stars: ✭ 20 (-83.33%)
Looper: A resource list for causality in statistics, data science and physics.
Stars: ✭ 23 (-80.83%)
PAML: Personalizing Dialogue Agents via Meta-Learning.
Stars: ✭ 114 (-5%)
Chocolate: A fully decentralized hyperparameter optimization framework.
Stars: ✭ 112 (-6.67%)
Open-L2O: A comprehensive and reproducible benchmark for learning-to-optimize algorithms.
Stars: ✭ 108 (-10%)
Learningtocompare fsl: PyTorch code for the CVPR 2018 paper "Learning to Compare: Relation Network for Few-Shot Learning" (few-shot learning part).
Stars: ✭ 837 (+597.5%)
MeTAL: Official PyTorch implementation of "Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning" (ICCV 2021 Oral).
Stars: ✭ 24 (-80%)
Mgo: Purely functional genetic algorithms for multi-objective optimisation.
Stars: ✭ 63 (-47.5%)
polystores: A library for performing hyperparameter optimization.
Stars: ✭ 48 (-60%)
Gradient Free Optimizers: Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Stars: ✭ 711 (+492.5%)
Nni: An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyperparameter tuning.
Stars: ✭ 10,698 (+8815%)
Maml Tf: TensorFlow implementation of MAML.
Stars: ✭ 44 (-63.33%)
Hypersearch: Hyperparameter optimization for PyTorch.
Stars: ✭ 376 (+213.33%)
Test Tube: Python library to easily log experiments and parallelize hyperparameter search for neural networks.
Stars: ✭ 663 (+452.5%)
hyper-engine: Python library for Bayesian hyperparameter optimization.
Stars: ✭ 80 (-33.33%)
Bayeso: Simple but essential Bayesian optimization package.
Stars: ✭ 57 (-52.5%)
Hyperopt.jl: Hyperparameter optimization in Julia.
Stars: ✭ 144 (+20%)
Deephyper: Scalable asynchronous neural architecture and hyperparameter search for deep neural networks.
Stars: ✭ 117 (-2.5%)
bbopt: Black-box hyperparameter optimization made easy.
Stars: ✭ 66 (-45%)
Hyperband: Tuning hyperparameters fast with Hyperband.
Stars: ✭ 555 (+362.5%)
syne-tune: Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-12.5%)
cmaes: Python library for CMA Evolution Strategy.
Stars: ✭ 174 (+45%)
Atm: Auto Tune Models, a multi-tenant, multi-data system for automated machine learning (model selection and tuning).
Stars: ✭ 504 (+320%)
miraiml: MiraiML, asynchronous, autonomous and continuous machine learning in Python.
Stars: ✭ 23 (-80.83%)
Gnn Meta Attack: Implementation of the paper "Adversarial Attacks on Graph Neural Networks via Meta Learning".
Stars: ✭ 99 (-17.5%)
Meta-DETR: Official PyTorch implementation of Meta-DETR.
Stars: ✭ 205 (+70.83%)
Meta Dataset: A dataset of datasets for learning to learn from few examples.
Stars: ✭ 483 (+302.5%)
autotune: Autonomous performance tuning for Kubernetes.
Stars: ✭ 84 (-30%)
G Meta: Graph meta-learning via local subgraphs (NeurIPS 2020).
Stars: ✭ 50 (-58.33%)
MetaHeac: Official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising" (KDD 2021).
Stars: ✭ 36 (-70%)
Optuna: A hyperparameter optimization framework.
Stars: ✭ 5,679 (+4632.5%)
hone: A shell-friendly hyperparameter search tool inspired by Optuna.
Stars: ✭ 17 (-85.83%)
Meta Blocks: A modular toolbox for meta-learning research with a focus on speed and reproducibility.
Stars: ✭ 110 (-8.33%)
mltb: Machine Learning Tool Box.
Stars: ✭ 25 (-79.17%)
Simple: Experimental global optimization algorithm.
Stars: ✭ 450 (+275%)
LearningToCompare-Tensorflow: TensorFlow implementation of the paper "Learning to Compare: Relation Network for Few-Shot Learning".
Stars: ✭ 17 (-85.83%)
L2p Gnn: Code and datasets for the AAAI 2021 paper "Learning to Pre-train Graph Neural Networks".
Stars: ✭ 48 (-60%)
Meta Transfer Learning: TensorFlow and PyTorch implementations of "Meta-Transfer Learning for Few-Shot Learning" (CVPR 2019).
Stars: ✭ 439 (+265.83%)
textlearnR: A simple collection of well-performing NLP models (Keras, H2O, StarSpace) tuned and benchmarked on a variety of datasets.
Stars: ✭ 16 (-86.67%)
Hyperopt Keras Cnn Cifar 100: Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could easily be transferred to another dataset or another classification task.
Stars: ✭ 95 (-20.83%)
Neuraxle: A scikit-learn-like framework for hyperparameter tuning and AutoML in deep learning projects, with abstractions and design patterns for doing AutoML properly: give your pipeline steps hyperparameter spaces, enable checkpoints to cut duplicate calculations, and move from research to production easily.
Stars: ✭ 377 (+214.17%)
Hypertunity: A toolset for black-box hyperparameter optimisation.
Stars: ✭ 119 (-0.83%)
Fewshotnlp: Source code for the papers "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".
Stars: ✭ 115 (-4.17%)
What I Have Read: Paper lists, notes and slides, with a focus on NLP. For summarization, see https://github.com/xcfcode/Summarization-Papers.
Stars: ✭ 110 (-8.33%)
Cmaes: Python library for CMA Evolution Strategy.
Stars: ✭ 88 (-26.67%)
Mljar Supervised: Automated machine learning pipeline with feature engineering and hyperparameter tuning 🚀
Stars: ✭ 961 (+700.83%)
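Most of the hyperparameter optimization tools above (Optuna, Talos, Chocolate, Hyperband, bbopt, ...) wrap more sophisticated variants of the same core loop: sample a configuration from a search space, evaluate an objective, keep the best result. As a frame of reference, here is a minimal random-search sketch of that loop; the function name and the toy objective are illustrative and not taken from any listed library.

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Uniformly sample configurations from `space` and keep the best one.

    `space` maps each hyperparameter name to a (low, high) range;
    `objective` takes a config dict and returns a value to minimize.
    """
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Toy quadratic objective with its minimum at x=2, y=-1.
cfg, val = random_search(
    lambda c: (c["x"] - 2) ** 2 + (c["y"] + 1) ** 2,
    {"x": (-5.0, 5.0), "y": (-5.0, 5.0)},
)
```

The listed libraries differ mainly in how they replace the uniform sampler with smarter proposals (e.g. Bayesian optimization in BayesO and hyper-engine, successive halving in Hyperband) and in the parallelism, pruning, and persistence they add around the loop.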