Nni: An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+8458.4%)
Autodl Projects: Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+849.6%)
nas-encodings: Encodings for neural architecture search
Stars: ✭ 29 (-76.8%)
Awesome Autodl: A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+1355.2%)
Awesome Automl And Lightweight Models: A list of high-quality (newest) AutoML works and lightweight models, including 1) neural architecture search, 2) lightweight structures, 3) model compression, quantization and acceleration, 4) hyperparameter optimization, and 5) automated feature engineering.
Stars: ✭ 691 (+452.8%)
Hypernets: A general automated machine learning framework to simplify the development of end-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+76.8%)
Nas Benchmark: "NAS evaluation is frustratingly hard", ICLR 2020
Stars: ✭ 126 (+0.8%)
Naszilla: A Python library for neural architecture search (NAS)
Stars: ✭ 181 (+44.8%)
Dna: Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (+17.6%)
Awesome Automl Papers: A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+2458.4%)
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV 2020)
Stars: ✭ 66 (-47.2%)
SIGIR2021 Conure: One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-81.6%)
Autogluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+3036%)
Devol: Genetic neural architecture search with Keras
Stars: ✭ 925 (+640%)
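Genetic NAS tools like Devol evolve architecture encodings through mutation and selection rather than gradient signals. A minimal sketch of that loop in plain Python (not Devol's API; the layer-width genome and the toy fitness function are invented for illustration, a real fitness would train and validate each candidate):

```python
import random

def mutate(genome):
    """Replace one randomly chosen gene (layer width) with a new value."""
    g = list(genome)
    i = random.randrange(len(g))
    g[i] = random.choice([16, 32, 64, 128])
    return g

def evolve(fitness, pop_size=20, genome_len=4, generations=30):
    """Keep the fittest half each generation, refill with mutated parents."""
    population = [[random.choice([16, 32, 64, 128]) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # elitism: best half survives
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

# Toy fitness: prefer architectures whose total width is close to 256.
fitness = lambda g: -abs(sum(g) - 256)

random.seed(0)
best = evolve(fitness)
print(best, sum(best))
```

Because the best half of each generation is carried over unchanged, the top fitness never decreases, which is the property that makes even this tiny loop converge toward good encodings.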
Hpbandster: A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+264.8%)
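The core building block behind Hyperband (and hence HpBandSter) is successive halving: evaluate many configurations on a small budget, keep the best fraction, and re-evaluate the survivors with a larger budget. A dependency-free sketch of that inner loop (the config format and toy evaluate function are invented for illustration, not HpBandSter's API):

```python
import random

def successive_halving(configs, evaluate, budget=1, eta=3, rounds=3):
    """Keep the top 1/eta of configs each round, multiplying the budget by eta."""
    survivors = list(configs)
    for _ in range(rounds):
        if len(survivors) <= 1:
            break
        scored = [(evaluate(c, budget), c) for c in survivors]
        scored.sort(key=lambda s: s[0], reverse=True)  # higher score is better
        survivors = [c for _, c in scored[: max(1, len(survivors) // eta)]]
        budget *= eta
    return survivors[0]

# Toy objective: the best config has lr closest to 0.1; more budget helps all.
def evaluate(config, budget):
    return -abs(config["lr"] - 0.1) + 0.01 * budget

random.seed(0)
configs = [{"lr": random.uniform(0.001, 1.0)} for _ in range(27)]
best = successive_halving(configs, evaluate, budget=1, eta=3, rounds=3)
print(best)
```

With eta=3 and 27 starting configs, the rounds shrink the pool 27 → 9 → 3 → 1 while the per-config budget grows 1 → 3 → 9, so total compute stays roughly constant per round; Hyperband then wraps several such brackets with different starting pool sizes.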
Efficientnas: Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search (https://arxiv.org/abs/1807.06906)
Stars: ✭ 44 (-64.8%)
Petridishnn: Code for the neural architecture search methods contained in the paper "Efficient Forward Neural Architecture Search"
Stars: ✭ 112 (-10.4%)
Mtlnas: [CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (-53.6%)
Autokeras: AutoML library for deep learning
Stars: ✭ 8,269 (+6515.2%)
Deephyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-6.4%)
Amla: AutoML frAmework for Neural Networks
Stars: ✭ 119 (-4.8%)
Autodeeplab: AutoDeeplab (auto-deeplab), AutoML for semantic segmentation, implemented in PyTorch
Stars: ✭ 269 (+115.2%)
Nas Bench 201: NAS-Bench-201 API and instructions
Stars: ✭ 537 (+329.6%)
Paddleslim: An open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+441.6%)
Autodl: Automated deep learning without ANY human intervention. 1st solution for AutoDL [email protected]
Stars: ✭ 854 (+583.2%)
Shape Adaptor: The implementation of "Shape Adaptor: A Learnable Resizing Module" (ECCV 2020).
Stars: ✭ 59 (-52.8%)
Once For All: [ICLR 2020] Once for All: Train One Network and Specialize It for Efficient Deployment
Stars: ✭ 1,127 (+801.6%)
Archai: Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266 (+112.8%)
AutoSpeech: [InterSpeech 2020] "AutoSpeech: Neural Architecture Search for Speaker Recognition" by Shaojin Ding*, Tianlong Chen*, Xinyu Gong, Weiwei Zha, Zhangyang Wang
Stars: ✭ 195 (+56%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (+55.2%)
HyperKeras: An AutoDL tool for neural architecture search and hyperparameter optimization on TensorFlow and Keras
Stars: ✭ 29 (-76.8%)
Adanet: Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+2572%)
Darts: Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+2670.4%)
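DARTS makes the discrete choice among candidate operations differentiable by relaxing it into a softmax-weighted mixture controlled by architecture parameters (the alphas). A dependency-free sketch of that relaxation for a single edge (the scalar ops are stand-ins for real conv/pool/skip layers and are invented for illustration; the paper additionally alternates gradient steps on network weights and alphas):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Candidate operations on a scalar input (stand-ins for conv, skip, zero, ...)
ops = [
    lambda x: x,        # identity / skip connection
    lambda x: 2.0 * x,  # "conv"-like transform
    lambda x: 0.0,      # zero op (effectively drops the edge)
]

def mixed_op(x, alphas):
    """DARTS's continuous relaxation: softmax-weighted sum of all candidates."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, ops))

alphas = [0.1, 2.0, -1.0]  # architecture parameters for one edge
out = mixed_op(3.0, alphas)
# At discretization time, DARTS keeps the op with the largest alpha:
chosen = max(range(len(alphas)), key=lambda i: alphas[i])
print(out, chosen)
```

Because the mixture is smooth in the alphas, the architecture can be optimized by ordinary gradient descent alongside the weights; the final discrete network is read off by taking the argmax op on each edge.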
Morph Net: Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+649.6%)
Pnasnet.pytorch: PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+147.2%)
Pnasnet.tf: TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-18.4%)
Sgas: SGAS: Sequential Greedy Architecture Search (CVPR 2020), https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (+9.6%)
Fairdarts: Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search
Stars: ✭ 145 (+16%)
libai: LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+127.2%)
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification (ICCV 2021)
Stars: ✭ 39 (-68.8%)
nn-Meter: A DNN inference latency prediction toolkit for accurately modeling and predicting latency on diverse edge devices.
Stars: ✭ 211 (+68.8%)
GCL: List of publications in graph contrastive learning
Stars: ✭ 25 (-80%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection (AAAI 2020)
Stars: ✭ 89 (-28.8%)
pcdarts-tf2: An unofficial implementation of PC-DARTS (Partial Channel Connections for Memory-Efficient Differentiable Architecture Search, ICLR 2020) in TensorFlow 2.0+.
Stars: ✭ 25 (-80%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-83.2%)
sspender: Helps you suspend your server (and spin down disks) when idle.
Stars: ✭ 21 (-83.2%)
ESNAC: Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-78.4%)
Transformer-Transducer: PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (-51.2%)
graph-transformer-pytorch: Implementation of Graph Transformer in PyTorch, for potential use in replicating AlphaFold2
Stars: ✭ 81 (-35.2%)
newt: Natural World Tasks
Stars: ✭ 24 (-80.8%)
wenet: Production-first and production-ready end-to-end speech recognition toolkit
Stars: ✭ 2,384 (+1807.2%)
AGD: [ICML 2020] "AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks" by Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Lin, Zhangyang Wang
Stars: ✭ 98 (-21.6%)