
D-X-Y / Autodl Projects

License: MIT
Automated deep learning algorithms implemented in PyTorch.

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Autodl Projects

Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (-41.79%)
Mutual labels:  nas, automl, neural-architecture-search
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-89.39%)
Mutual labels:  nas, automl, neural-architecture-search
Nni
An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+801.26%)
Mutual labels:  nas, automl, neural-architecture-search
Awesome Nas Papers
Awesome Neural Architecture Search Papers
Stars: ✭ 213 (-82.06%)
Mutual labels:  nas, automl, neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-97.81%)
Mutual labels:  nas, automl, neural-architecture-search
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-81.38%)
Mutual labels:  nas, automl, neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+53.24%)
Mutual labels:  nas, automl, neural-architecture-search
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-89.47%)
Mutual labels:  nas, automl, neural-architecture-search
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-97.56%)
Mutual labels:  nas, automl, neural-architecture-search
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (-73.97%)
Mutual labels:  automl, neural-architecture-search
Shape Adaptor
The implementation of "Shape Adaptor: A Learnable Resizing Module" [ECCV 2020].
Stars: ✭ 59 (-95.03%)
Mutual labels:  nas, automl
Morph Net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (-21.06%)
Mutual labels:  automl, neural-architecture-search
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+230.24%)
Mutual labels:  automl, neural-architecture-search
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+169.42%)
Mutual labels:  automl, neural-architecture-search
Autodl
Automated Deep Learning without ANY human intervention. 1st solution for the AutoDL challenge@NeurIPS.
Stars: ✭ 854 (-28.05%)
Mutual labels:  nas, automl
Autodeeplab
AutoDeeplab / auto-deeplab / AutoML for semantic segmentation, implemented in PyTorch
Stars: ✭ 269 (-77.34%)
Mutual labels:  nas, automl
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (-22.07%)
Mutual labels:  automl, neural-architecture-search
Autokeras
AutoML library for deep learning
Stars: ✭ 8,269 (+596.63%)
Mutual labels:  automl, neural-architecture-search
Archai
Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266 (-77.59%)
Mutual labels:  nas, neural-architecture-search
Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+191.74%)
Mutual labels:  automl, neural-architecture-search



Automated Deep Learning Projects (AutoDL-Projects) is an open-source, lightweight, yet useful project for everyone. It implements several neural architecture search (NAS) and hyper-parameter optimization (HPO) algorithms. A Chinese introduction is available in README_CN.md.

Who should consider using AutoDL-Projects

  • Beginners who want to try different AutoDL algorithms
  • Engineers who want to investigate whether AutoDL works on their projects
  • Researchers who want to easily implement and experiment with new AutoDL algorithms

Why should we use AutoDL-Projects

  • Simple library dependencies
  • All algorithms are in the same codebase
  • Active maintenance

AutoDL-Projects Capabilities

At the moment, this project provides the following algorithms and scripts to run them. Please see the details via the link in the description column; a brief usage sketch for the benchmark entries follows the table.

| Type  | ABBRV | Algorithms | Description |
|:-----:|:-----:|:-----------|:------------|
| NAS   | TAS | Network Pruning via Transformable Architecture Search | NeurIPS-2019-TAS.md |
| NAS   | DARTS | DARTS: Differentiable Architecture Search | ICLR-2019-DARTS.md |
| NAS   | GDAS | Searching for A Robust Neural Architecture in Four GPU Hours | CVPR-2019-GDAS.md |
| NAS   | SETN | One-Shot Neural Architecture Search via Self-Evaluated Template Network | ICCV-2019-SETN.md |
| NAS   | NAS-Bench-201 | NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search | NAS-Bench-201.md |
| NAS   | NATS-Bench | NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size | NATS-Bench.md |
| NAS   | ENAS / REA / REINFORCE / BOHB | Please check the original papers | NAS-Bench-201.md, NATS-Bench.md |
| HPO   | HPO-CG | Hyperparameter optimization with approximate gradient | coming soon |
| Basic | ResNet | Deep-learning-based image classification | BASELINE.md |
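
For example, the NATS-Bench entry above can be queried programmatically. Below is a minimal sketch assuming the `nats_bench` package is installed (`pip install nats_bench`) and its benchmark file has been downloaded as described in NATS-Bench.md; the architecture index 12 and the "200"-epoch setting are arbitrary illustration values.

```python
# Minimal sketch of querying NATS-Bench (assumes `pip install nats_bench`
# and the benchmark file placed under $TORCH_HOME per NATS-Bench.md;
# index 12 and hp="200" are arbitrary illustration values).
from nats_bench import create

# Load the topology search space ("tss"); fast_mode avoids loading all data upfront.
api = create(None, "tss", fast_mode=True, verbose=False)

# Query training/test statistics of architecture #12 on CIFAR-10.
info = api.get_more_info(12, "cifar10", hp="200")
print(info["test-accuracy"])
```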

Requirements and Preparation

Please install Python>=3.6 and PyTorch>=1.3.0. (Lower versions of Python and PyTorch may also work, but are untested and may hit bugs.) Some visualization code additionally requires opencv.
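
To sanity-check those version floors before running anything, a throwaway snippet like the following can help (a sketch, not part of the repo):

```python
# Quick environment check for the version floors stated above
# (a sketch, not part of AutoDL-Projects).
import sys

import torch

assert sys.version_info >= (3, 6), "AutoDL-Projects expects Python>=3.6"
major, minor = (int(x) for x in torch.__version__.split(".")[:2])
assert (major, minor) >= (1, 3), "AutoDL-Projects expects PyTorch>=1.3.0"
print(f"Python {sys.version.split()[0]} / PyTorch {torch.__version__}: OK")
```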

CIFAR and ImageNet should be downloaded and extracted into $TORCH_HOME. Some methods use knowledge distillation (KD) and therefore require pre-trained models; please download them from Google Drive (or train them yourself) and save them into .latent-data.
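
If you want to verify that the datasets landed where you expect, a small sketch like the one below can help. The CIFAR directory names are torchvision's standard extraction names; `ILSVRC2012` is only an assumed ImageNet folder name, so adjust it to your layout.

```python
# Verify that datasets are extracted under $TORCH_HOME (a hedged sketch;
# "ILSVRC2012" is an assumed ImageNet directory name, adjust as needed).
import os

torch_home = os.environ.get("TORCH_HOME")
assert torch_home, "Please set the $TORCH_HOME environment variable first."
for name in ("cifar-10-batches-py", "cifar-100-python", "ILSVRC2012"):
    path = os.path.join(torch_home, name)
    status = "found" if os.path.isdir(path) else "MISSING"
    print(f"{path}: {status}")
```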

Please use

```bash
git clone --recurse-submodules git@github.com:D-X-Y/AutoDL-Projects.git
```

to download this repo with its submodules.
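
If you have already cloned the repo without submodules, you can fetch them afterwards with the standard git command:

```bash
git submodule update --init --recursive
```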

Citation

If you find that this project helps your research, please consider citing the related papers:

@article{dong2020autohas,
  title={{AutoHAS}: Efficient Hyperparameter and Architecture Search},
  author={Dong, Xuanyi and Tan, Mingxing and Yu, Adams Wei and Peng, Daiyi and Gabrys, Bogdan and Le, Quoc V},
  journal={arXiv preprint arXiv:2006.03656},
  year={2020}
}
@article{dong2021nats,
  title   = {{NATS-Bench}: Benchmarking NAS Algorithms for Architecture Topology and Size},
  author  = {Dong, Xuanyi and Liu, Lu and Musial, Katarzyna and Gabrys, Bogdan},
  doi     = {10.1109/TPAMI.2021.3054824},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
  year    = {2021},
  note    = {\mbox{doi}:\url{10.1109/TPAMI.2021.3054824}}
}
@inproceedings{dong2020nasbench201,
  title     = {{NAS-Bench-201}: Extending the Scope of Reproducible Neural Architecture Search},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {International Conference on Learning Representations (ICLR)},
  url       = {https://openreview.net/forum?id=HJxyZkBKDr},
  year      = {2020}
}
@inproceedings{dong2019tas,
  title     = {Network Pruning via Transformable Architecture Search},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  pages     = {760--771},
  year      = {2019}
}
@inproceedings{dong2019one,
  title     = {One-Shot Neural Architecture Search via Self-Evaluated Template Network},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  pages     = {3681--3690},
  year      = {2019}
}
@inproceedings{dong2019search,
  title     = {Searching for A Robust Neural Architecture in Four GPU Hours},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages     = {1761--1770},
  year      = {2019}
}

Others

If you want to contribute to this repo, please see CONTRIBUTING.md, and please follow CODE-OF-CONDUCT.md.

We use black as the Python code formatter; please run `black . -l 88`.

License

The entire codebase is under the MIT license.
