
microsoft / Archai

License: MIT
Reproducible Rapid Research for Neural Architecture Search (NAS)

Programming Languages

python

Projects that are alternatives of or similar to Archai

TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (-75.19%)
Mutual labels:  nas, neural-architecture-search
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-89.1%)
Mutual labels:  nas, neural-architecture-search
Nni
An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+3921.8%)
Mutual labels:  nas, neural-architecture-search
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+154.51%)
Mutual labels:  nas, neural-architecture-search
CM-NAS
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification (ICCV2021)
Stars: ✭ 39 (-85.34%)
Mutual labels:  nas, neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+346.24%)
Mutual labels:  nas, neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+583.83%)
Mutual labels:  nas, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+159.77%)
Mutual labels:  nas, neural-architecture-search
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-16.92%)
Mutual labels:  nas, neural-architecture-search
Awesome Nas Papers
Awesome Neural Architecture Search Papers
Stars: ✭ 213 (-19.92%)
Mutual labels:  nas, neural-architecture-search
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-52.63%)
Mutual labels:  nas, neural-architecture-search
deep-learning-roadmap
my own deep learning mastery roadmap
Stars: ✭ 40 (-84.96%)
Mutual labels:  nas, neural-architecture-search
Dna
Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (-44.74%)
Mutual labels:  nas, neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-90.23%)
Mutual labels:  nas, neural-architecture-search
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-53.01%)
Mutual labels:  nas, neural-architecture-search
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+142.11%)
Mutual labels:  nas
Interstellar
Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding. NeurIPS 2020.
Stars: ✭ 28 (-89.47%)
Mutual labels:  neural-architecture-search
sherpa
a mini-package-manager for QNAP NAS
Stars: ✭ 63 (-76.32%)
Mutual labels:  nas
Arozos
General purposed Web Desktop Operating Platform / OS for Raspberry Pis, Now written in Go!
Stars: ✭ 252 (-5.26%)
Mutual labels:  nas
borg
Client-server stack for Web3! Turn your Raspberry Pi to a BAS server in minutes and enjoy the freedom of decentralized Web with a superior user experience!
Stars: ✭ 25 (-90.6%)
Mutual labels:  nas

Welcome to Archai

Archai is a platform for Neural Architecture Search (NAS) that allows you to generate efficient deep networks for your applications. Archai aspires to accelerate NAS research by making it easy to mix and match different techniques while ensuring reproducibility, self-documented hyper-parameters and fair comparison. To achieve this, Archai uses a common code base that unifies several algorithms. Archai is extensible and modular, allowing rapid experimentation with new research ideas and development of new NAS algorithms. Archai also hopes to make NAS research more accessible to non-experts by providing a powerful configuration system and easy-to-use tools.
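To make the kind of search problem concrete that NAS frameworks automate, here is a self-contained toy sketch. This is not Archai's API; the search space, the proxy scorer, and all names are hypothetical, and real NAS replaces the cheap `proxy_score` stand-in with training or accuracy estimation.

```python
import random

# Toy stand-in for NAS: random search over a tiny discrete search space.
# Hypothetical illustration only -- not Archai's API.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [16, 32, 64],
    "op": ["conv3x3", "conv5x5", "sep_conv"],
}

def sample_architecture(rng):
    """Draw one architecture uniformly at random from the search space."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Hypothetical cheap evaluator; real NAS would train or estimate accuracy."""
    bonus = {"conv3x3": 0.0, "conv5x5": 0.1, "sep_conv": 0.2}[arch["op"]]
    return arch["num_layers"] * 0.01 + arch["width"] * 0.001 + bonus

def random_search(num_samples=20, seed=0):
    """Sample architectures and return the best one under the proxy score."""
    rng = random.Random(seed)
    candidates = [sample_architecture(rng) for _ in range(num_samples)]
    return max(candidates, key=proxy_score)

best = random_search()
print(best)
```

Random search is the baseline that more sophisticated NAS algorithms (DARTS, Petridish, etc.) aim to beat; a common code base such as Archai's lets those algorithms be swapped in behind the same search-space and evaluation interfaces.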

Extensive feature list

Installation

Prerequisites

Archai requires Python 3.6+ and PyTorch 1.2+. To install Python, we highly recommend Anaconda. Archai works on both Linux and Windows.
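A quick way to check your environment against these prerequisites (the version numbers come from the sentence above; the snippet itself is just a generic version check, not an Archai tool):

```python
import sys

# Check the interpreter against the stated minimum (Python 3.6+).
assert sys.version_info >= (3, 6), "Archai requires Python 3.6+"

# Check PyTorch against the stated minimum (1.2+), if it is installed.
try:
    import torch
    major, minor = (int(part) for part in torch.__version__.split(".")[:2])
    assert (major, minor) >= (1, 2), "Archai requires PyTorch 1.2+"
    print(f"OK: Python {sys.version_info[:2]}, PyTorch {torch.__version__}")
except ImportError:
    print("PyTorch not installed; see https://pytorch.org for instructions")
```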

Install from source code

We recommend installing from the source code:

git clone https://github.com/microsoft/archai.git
cd archai
./install.sh  # on Windows, use install.bat

For more information, please see the Install guide.

Quick Start

Running Algorithms

To run a specific NAS algorithm, specify it with the --algos switch:

python scripts/main.py --algos darts --full

For more information on available switches and algorithms, please see running algorithms.

Tutorials

The best way to familiarize yourself with Archai is to take a quick tour through our 30 Minute tutorial.

We also have a tutorial for the Petridish algorithm, which was developed at Microsoft Research and is now available through Archai.

Visual Studio Code

We highly recommend Visual Studio Code to take advantage of predefined run configurations and interactive debugging.

From the archai directory, launch Visual Studio Code. Select the Run view (Ctrl+Shift+D), choose the run configuration you want, and click the Play icon.

Running experiments on Azure AML

To run NAS experiments at scale, you can use Archai on Azure.

Documentation

Docs and an API reference are available for browsing and searching.

Contribute

We would love community contributions, feedback, questions, algorithm implementations and feature requests! Please file a GitHub issue or send us a pull request. Please review the Microsoft Code of Conduct to learn more.

Contact

Join the Archai group on Facebook to stay up to date or ask any questions.

Team

Archai has been created and maintained by Shital Shah and Debadeepta Dey in the Reinforcement Learning Group at Microsoft Research AI, Redmond, USA. Archai has benefited immensely from discussions with John Langford, Rich Caruana, Eric Horvitz and Alekh Agarwal.

We look forward to Archai becoming more community driven and including major contributors here.

Credits

Archai builds on several open source codebases, including: Fast AutoAugment, pt.darts, DARTS-PyTorch, DARTS, petridishnn, PyTorch CIFAR-10 Models, NVidia DeepLearning Examples, PyTorch Warmup Scheduler, NAS Evaluation is Frustratingly Hard, and NASBench-PyTorch. Please see the install_requires section in setup.py for an up-to-date list of dependencies. If you feel credit for any material is missing, please let us know by filing a GitHub issue.

License

This project is released under the MIT License. Please review the License file for more details.
