Randwirenn - Implementation of "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675
Real Time Network - Real-time network architecture for mobile devices and semantic segmentation
Stars: ✭ 308
Nni - An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
Stars: ✭ 10,698
Devol - Genetic neural architecture search with Keras
Stars: ✭ 925
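Genetic approaches like Devol's evolve a population of architecture "genomes" rather than training a single model. As a toy illustration only (not Devol's actual API), a genetic search over per-layer width choices might look like:

```python
import random

random.seed(0)

# Toy search space: each genome picks a width for three layers.
WIDTHS = [16, 32, 64, 128]

def random_genome():
    return [random.choice(WIDTHS) for _ in range(3)]

def fitness(genome):
    # Stand-in for validation accuracy: reward a non-increasing
    # ("pyramid") shape, penalize parameter count. Purely illustrative.
    shape_bonus = sum(1 for a, b in zip(genome, genome[1:]) if a >= b)
    param_penalty = sum(genome) / 1000.0
    return shape_bonus - param_penalty

def mutate(genome):
    # Resample one randomly chosen layer width.
    child = genome[:]
    child[random.randrange(len(child))] = random.choice(WIDTHS)
    return child

def evolve(generations=20, pop_size=10):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, refill with mutated survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return max(population, key=fitness)

best = evolve()
print(best)
```

Real systems replace the `fitness` stub with actual training and validation of the decoded network, which is where nearly all of the cost lies.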
regnet.pytorch - PyTorch-style and human-readable RegNet with a spectrum of pre-trained models
Stars: ✭ 50
Petridishnn - Code for the neural architecture search methods in the paper "Efficient Forward Neural Architecture Search"
Stars: ✭ 112
Autogan - [ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang, and Zhangyang Wang
Stars: ✭ 388
Sgas - SGAS: Sequential Greedy Architecture Search (CVPR 2020), https://www.deepgcns.org/auto/sgas
Stars: ✭ 137
Randwirenn - PyTorch implementation of "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 270
Tenas - [ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, and Zhangyang Wang
Stars: ✭ 63
Neural Architecture Search With Rl - Minimal TensorFlow implementation of the paper "Neural Architecture Search with Reinforcement Learning", presented at ICLR 2017
Stars: ✭ 37
syne-tune - Large-scale, asynchronous hyperparameter optimization at your fingertips.
Stars: ✭ 105
Amla - AutoML frAmework for Neural Networks
Stars: ✭ 119
Awesome Automl And Lightweight Models - A list of high-quality (newest) AutoML works and lightweight models, including 1) neural architecture search, 2) lightweight structures, 3) model compression, quantization, and acceleration, 4) hyperparameter optimization, and 5) automated feature engineering.
Stars: ✭ 691
Deep architect legacy - DeepArchitect: Automatically Designing and Training Deep Architectures
Stars: ✭ 144
Hpbandster - A distributed Hyperband implementation on steroids
Stars: ✭ 456
Graphnas - Code for running the GraphNAS algorithm.
Stars: ✭ 104
Adanet - Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340
Lycoris - A lightweight and easy-to-use deep learning framework with neural architecture search.
Stars: ✭ 180
Autogluon - AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920
Hydra - Multi-task learning framework on PyTorch. State-of-the-art methods are implemented to effectively train models on multiple tasks.
Stars: ✭ 87
Deepswarm - Neural architecture search powered by swarm intelligence 🐜
Stars: ✭ 263
Awesome Autodl - A curated list of automated deep learning resources, including neural architecture search and hyperparameter optimization.
Stars: ✭ 1,819
ViPNAS - Official repo for the CVPR 2021 paper "ViPNAS: Efficient Video Pose Estimation via Neural Architecture Search".
Stars: ✭ 32
Efficientnas - Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search (https://arxiv.org/abs/1807.06906)
Stars: ✭ 44
rnn darts fastai - Implementation of Differentiable Architecture Search (DARTS) for RNNs with fastai
Stars: ✭ 21
Nasbot - Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120
Morph Net - Fast & simple resource-constrained learning of deep network structure
Stars: ✭ 937
Dna - Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147
Slimmable networks - Slimmable Networks, AutoSlim, and beyond (ICLR 2019 and ICCV 2019)
Stars: ✭ 708
Deephyper - Scalable asynchronous neural architecture and hyperparameter search for deep neural networks
Stars: ✭ 117
Paddleslim - An open-source library for deep model compression and architecture search.
Stars: ✭ 677
Naszilla - A Python library for neural architecture search (NAS)
Stars: ✭ 181
Deep architect - A general, modular, and programmable architecture search framework
Stars: ✭ 110
Fasterseg - [ICLR 2020] "FasterSeg: Searching for Faster Real-time Semantic Segmentation" by Wuyang Chen, Xinyu Gong, Xianming Liu, Qian Zhang, Yuan Li, and Zhangyang Wang
Stars: ✭ 438
Scarlet Nas - Bridging the Gap Between Stability and Scalability in Neural Architecture Search
Stars: ✭ 140
Neural Architecture Search - Basic implementation of [Neural Architecture Search with Reinforcement Learning](https://arxiv.org/abs/1611.01578).
Stars: ✭ 352
Pnasnet.tf - TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102
Darts - Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463
Atomnas - Code for the ICLR 2020 paper "AtomNAS: Fine-Grained End-to-End Neural Architecture Search"
Stars: ✭ 197
Pnasnet.pytorch - PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309
Robnets - [CVPR 2020] "When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks"
Stars: ✭ 95
Awesome Automl Papers - A curated list of automated machine learning papers, articles, tutorials, slides, and projects
Stars: ✭ 3,198
Single Path One Shot Nas Mxnet - Single Path One-Shot NAS implementation in MXNet with a full training and searching pipeline. Supports both block and channel selection. Searched models that outperform those in the original paper are provided.
Stars: ✭ 136
Archai - Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266
Autodl Projects - Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187
NEATEST - Evolving Neural Networks Through Augmenting Topologies with Evolution Strategy Training
Stars: ✭ 13
Nsga Net - NSGA-Net, a neural architecture search algorithm
Stars: ✭ 171
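Multi-objective methods like NSGA-Net search for architectures that trade off several objectives at once (e.g. error vs. FLOPs), keeping the non-dominated set rather than a single winner. A minimal Pareto-dominance sketch (illustrative, not NSGA-Net's code):

```python
def dominates(a, b):
    """a, b: tuples of objectives to minimize, e.g. (error, flops).
    a dominates b if it is no worse everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep every point that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (error, FLOPs-in-millions) pairs for four candidate nets.
candidates = [(0.10, 400), (0.08, 600), (0.12, 300), (0.09, 650)]
print(pareto_front(candidates))
# → [(0.1, 400), (0.08, 600), (0.12, 300)]
```

The point (0.09, 650) drops out because (0.08, 600) is better on both objectives; the remaining three each win on at least one axis.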
Interstellar - Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding (NeurIPS 2020)
Stars: ✭ 28
Mtlnas - [CVPR 2020] "MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning"
Stars: ✭ 58
nas-encodings - Encodings for neural architecture search
Stars: ✭ 29
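An architecture encoding maps a discrete architecture to a fixed-length vector that search algorithms can operate on. As a minimal sketch under assumed conventions (not nas-encodings' API), a cell whose edges each pick one operation from a fixed vocabulary can be one-hot encoded:

```python
# Toy operation vocabulary for each edge in a small cell DAG
# (illustrative; real search spaces are larger and structured).
OPS = ["none", "skip", "conv3x3", "maxpool"]

def encode(cell):
    """cell: list of op names, one per edge -> flat one-hot vector."""
    vec = []
    for op in cell:
        # One position per candidate op; exactly one is set per edge.
        vec.extend(1 if op == candidate else 0 for candidate in OPS)
    return vec

cell = ["conv3x3", "skip", "maxpool"]
print(encode(cell))
# → [0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1]
```

Each edge contributes `len(OPS)` entries, so every architecture in the space maps to a vector of the same length, which is what lets predictors and distance-based search methods consume it directly.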
Nas Segm Pytorch - Code for "Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells" (CVPR 2019)
Stars: ✭ 126
Nsganetv2 - [ECCV 2020] "NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search"
Stars: ✭ 52
Hyperactive - A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine learning models.
Stars: ✭ 182
Aw nas - aw_nas: a modularized and extensible NAS framework
Stars: ✭ 152
Nas Benchmark - "NAS evaluation is frustratingly hard" (ICLR 2020)
Stars: ✭ 126
Autokeras - AutoML library for deep learning
Stars: ✭ 8,269