Adanet - Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+1853.22%)
Deepswarm - Neural Architecture Search Powered by Swarm Intelligence 🐜
Stars: ✭ 263 (+53.8%)
Hpbandster - A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+166.67%)
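The core of Hyperband, which Hpbandster distributes, is successive halving: start many configurations on a small budget, keep the top fraction, and grow the budget. A minimal stdlib sketch, where `evaluate()` is a hypothetical stand-in for training a model for `budget` epochs (not Hpbandster's actual API):

```python
import random

def evaluate(config, budget):
    # Hypothetical: pretend validation loss depends on the config's "quality"
    # and improves with more budget.
    return config["quality"] / budget

def successive_halving(configs, min_budget=1, eta=3):
    budget = min_budget
    while len(configs) > 1:
        ranked = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = ranked[: max(1, len(configs) // eta)]  # keep best 1/eta
        budget *= eta  # survivors get eta times more budget next round
    return configs[0]

random.seed(0)
pool = [{"quality": random.random()} for _ in range(9)]
best = successive_halving(pool)
```

Hyperband proper runs several such brackets with different `min_budget`/pool-size trade-offs; Hpbandster additionally replaces random sampling with a model-based proposer (BOHB).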
TF-NAS - Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV 2020)
Stars: ✭ 66 (-61.4%)
Hydra - Multi-Task Learning framework on PyTorch. State-of-the-art methods are implemented to effectively train models on multiple tasks.
Stars: ✭ 87 (-49.12%)
Autogluon - AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+2192.4%)
Amla - AutoML frAmework for Neural Networks
Stars: ✭ 119 (-30.41%)
ViPNAS - The official repo for "ViPNAS: Efficient Video Pose Estimation via Neural Architecture Search" (CVPR 2021).
Stars: ✭ 32 (-81.29%)
Neural Architecture Search With RL - Minimal TensorFlow implementation of the paper "Neural Architecture Search with Reinforcement Learning" presented at ICLR 2017
Stars: ✭ 37 (-78.36%)
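The paper behind the entry above trains a controller with REINFORCE: sample an architecture choice, observe a reward (validation accuracy), and increase the log-probability of choices whose reward beats a moving baseline. A toy stdlib sketch of that loop over a single hyperparameter; `reward()` is a hypothetical stand-in for training and validating a sampled network:

```python
import math
import random

choices = [16, 32, 64]       # candidate layer widths (illustrative)
prefs = [0.0, 0.0, 0.0]      # controller logits, one per choice
lr = 0.5

def softmax(p):
    e = [math.exp(v - max(p)) for v in p]
    s = sum(e)
    return [v / s for v in e]

def reward(width):
    # Hypothetical: pretend these are validation accuracies.
    return {16: 0.7, 32: 0.9, 64: 0.8}[width]

random.seed(0)
baseline = 0.0
for step in range(200):
    probs = softmax(prefs)
    i = random.choices(range(len(choices)), probs)[0]
    r = reward(choices[i])
    baseline = 0.9 * baseline + 0.1 * r  # moving-average baseline
    adv = r - baseline
    # REINFORCE: grad of log-prob of sampled choice w.r.t. logits
    for j in range(len(prefs)):
        grad = (1.0 if j == i else 0.0) - probs[j]
        prefs[j] += lr * adv * grad

best = choices[max(range(len(choices)), key=lambda j: prefs[j])]
```

The real method uses an RNN controller emitting a whole sequence of architectural decisions, but the update rule is this same policy gradient.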
Randwirenn - Implementation of "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675 (+294.74%)
Nni - An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+6156.14%)
Autogan - [ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang and Zhangyang Wang
Stars: ✭ 388 (+126.9%)
Nas Benchmark - "NAS evaluation is frustratingly hard", ICLR 2020
Stars: ✭ 126 (-26.32%)
Real Time Network - Real-time network architecture for mobile devices and semantic segmentation
Stars: ✭ 308 (+80.12%)
Tenas - [ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, Zhangyang Wang
Stars: ✭ 63 (-63.16%)
Randwirenn - PyTorch implementation of "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 270 (+57.89%)
Sgas - SGAS: Sequential Greedy Architecture Search (CVPR 2020). https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (-19.88%)
regnet.pytorch - PyTorch-style and human-readable RegNet with a spectrum of pre-trained models
Stars: ✭ 50 (-70.76%)
Autokeras - AutoML library for deep learning
Stars: ✭ 8,269 (+4735.67%)
syne-tune - Large-scale and asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105 (-38.6%)
Petridishnn - Code for the neural architecture search methods contained in the paper "Efficient Forward Neural Architecture Search"
Stars: ✭ 112 (-34.5%)
BossNAS - (ICCV 2021) "BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search"
Stars: ✭ 125 (-26.9%)
Devol - Genetic neural architecture search with Keras
Stars: ✭ 925 (+440.94%)
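Genetic NAS tools like Devol search by evolving a population of encoded architectures. A toy aging-evolution loop in plain Python, where a genome is a list of integers (e.g. layer-width indices) and `fitness()` is a hypothetical stand-in for trained-model accuracy (this is an illustration of the general technique, not Devol's encoding):

```python
import random

def fitness(genome):
    # Hypothetical: best "architecture" is all 3s; fitness peaks at 0.
    return -sum((g - 3) ** 2 for g in genome)

def mutate(genome):
    g = list(genome)
    i = random.randrange(len(g))
    g[i] = random.randint(0, 6)  # resample one gene
    return g

random.seed(1)
population = [[random.randint(0, 6) for _ in range(4)] for _ in range(20)]
for _ in range(300):
    sample = random.sample(population, 5)      # tournament selection
    parent = max(sample, key=fitness)
    population.append(mutate(parent))          # child joins the population
    population.pop(0)                          # the oldest individual ages out

best = max(population, key=fitness)
```

Aging (killing the oldest rather than the worst) is the regularization trick from regularized evolution; simpler genetic NAS variants instead replace the least-fit individual.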
Paddleslim - An open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+295.91%)
nn-Meter - A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Stars: ✭ 211 (+23.39%)
Pnasnet.tf - TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-40.35%)
Nas Segm Pytorch - Code for "Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells", CVPR 2019
Stars: ✭ 126 (-26.32%)
Fasterseg - [ICLR 2020] "FasterSeg: Searching for Faster Real-time Semantic Segmentation" by Wuyang Chen, Xinyu Gong, Xianming Liu, Qian Zhang, Yuan Li, Zhangyang Wang
Stars: ✭ 438 (+156.14%)
Robnets - [CVPR 2020] When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks
Stars: ✭ 95 (-44.44%)
Neural Architecture Search - Basic implementation of [Neural Architecture Search with Reinforcement Learning](https://arxiv.org/abs/1611.01578).
Stars: ✭ 352 (+105.85%)
Scarlet Nas - Bridging the Gap Between Stability and Scalability in Neural Architecture Search
Stars: ✭ 140 (-18.13%)
Darts - Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+1925.15%)
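The core trick behind DARTS (and the other differentiable-NAS entries in this list) is relaxing the discrete choice among candidate operations into a softmax-weighted mixture, so architecture parameters can be trained by gradient descent. A minimal NumPy sketch of that "mixed operation"; the candidate ops and `alpha` here are illustrative, not the repo's API:

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Hypothetical candidate operations on an edge of the search cell.
ops = [lambda x: x, np.tanh, lambda x: np.maximum(x, 0.0)]

# Architecture parameters: one logit per candidate op. In DARTS these are
# optimized by gradient descent on validation loss; here they stay at zero.
alpha = np.zeros(len(ops))

w = softmax(alpha)                 # continuous relaxation of the choice
x = np.array([-1.0, 0.5])
mixed = sum(wi * op(x) for wi, op in zip(w, ops))
```

After the search, DARTS discretizes by keeping only the op with the largest weight on each edge (`ops[np.argmax(alpha)]`).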
Autodl Projects - Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+594.15%)
Pnasnet.pytorch - PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+80.7%)
Nasbot - Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (-29.82%)
Awesome Automl Papers - A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+1770.18%)
Mtlnas - [CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (-66.08%)
Archai - Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266 (+55.56%)
Dna - Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (-14.04%)
NEATEST - Evolving Neural Networks Through Augmenting Topologies with Evolution Strategy Training
Stars: ✭ 13 (-92.4%)
Nsganetv2 - [ECCV 2020] NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search
Stars: ✭ 52 (-69.59%)
Interstellar - Searching Recurrent Architecture for Knowledge Graph Embedding (NeurIPS 2020).
Stars: ✭ 28 (-83.63%)
Deephyper - Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-31.58%)
nas-encodings - Encodings for neural architecture search
Stars: ✭ 29 (-83.04%)
Efficientnas - Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search. https://arxiv.org/abs/1807.06906
Stars: ✭ 44 (-74.27%)
rnn darts fastai - Implementation of Differentiable Architecture Search (DARTS) for RNNs with fastai
Stars: ✭ 21 (-87.72%)
Single Path One Shot Nas Mxnet - Single Path One-Shot NAS MXNet implementation with a full training and searching pipeline. Supports both block and channel selection; searched models that outperform those in the original paper are provided.
Stars: ✭ 136 (-20.47%)
mindware - An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-80.12%)
Morph Net - Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+447.95%)
ESNAC - Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-84.21%)
Deep architect - A general, modular, and programmable architecture search framework
Stars: ✭ 110 (-35.67%)
Slimmable networks - Slimmable Networks, AutoSlim, and Beyond; ICLR 2019 and ICCV 2019
Stars: ✭ 708 (+314.04%)
Aw nas - aw_nas: A Modularized and Extensible NAS Framework
Stars: ✭ 152 (-11.11%)
Deep architect legacy - DeepArchitect: Automatically Designing and Training Deep Architectures
Stars: ✭ 144 (-15.79%)
Awesome Autodl - A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+963.74%)
Graphnas - Code necessary to run the GraphNAS algorithm.
Stars: ✭ 104 (-39.18%)
Awesome Automl And Lightweight Models - A list of high-quality (newest) AutoML works and lightweight models, including: 1) Neural Architecture Search; 2) Lightweight Structures; 3) Model Compression, Quantization and Acceleration; 4) Hyperparameter Optimization; 5) Automated Feature Engineering.
Stars: ✭ 691 (+304.09%)