chenxi116 / Pnasnet.tf

License: Apache-2.0
TensorFlow implementation of PNASNet-5 on ImageNet


Projects that are alternatives of or similar to Pnasnet.tf

Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (+9.8%)
Mutual labels:  automl, imagenet, neural-architecture-search
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+202.94%)
Mutual labels:  automl, imagenet, neural-architecture-search
Darts
Differentiable architecture search for convolutional and recurrent networks
Stars: ✭ 3,463 (+3295.1%)
Mutual labels:  automl, neural-architecture-search
Adanet
Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+3174.51%)
Mutual labels:  automl, neural-architecture-search
Nni
An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+10388.24%)
Mutual labels:  automl, neural-architecture-search
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (+3035.29%)
Mutual labels:  automl, neural-architecture-search
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+3743.14%)
Mutual labels:  automl, neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+1063.73%)
Mutual labels:  automl, neural-architecture-search
TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (-35.29%)
Mutual labels:  imagenet, neural-architecture-search
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (+806.86%)
Mutual labels:  automl, neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+577.45%)
Mutual labels:  automl, neural-architecture-search
Morph Net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+818.63%)
Mutual labels:  automl, neural-architecture-search
Randwirenn
Pytorch Implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 270 (+164.71%)
Mutual labels:  imagenet, neural-architecture-search
regnet.pytorch
PyTorch-style and human-readable RegNet with a spectrum of pre-trained models
Stars: ✭ 50 (-50.98%)
Mutual labels:  imagenet, neural-architecture-search
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-71.57%)
Mutual labels:  automl, neural-architecture-search
Hpbandster
a distributed Hyperband implementation on Steroids
Stars: ✭ 456 (+347.06%)
Mutual labels:  automl, neural-architecture-search
Autokeras
AutoML library for deep learning
Stars: ✭ 8,269 (+8006.86%)
Mutual labels:  automl, neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-74.51%)
Mutual labels:  automl, neural-architecture-search
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (+22.55%)
Mutual labels:  automl, neural-architecture-search
Randwirenn
Implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675 (+561.76%)
Mutual labels:  imagenet, neural-architecture-search

PNASNet.TF

TensorFlow implementation of PNASNet-5. While remaining fully compatible with the official implementation, this version focuses on simplicity and inference.

In particular, three files totaling about 1,200 lines (nasnet.py, nasnet_utils.py, pnasnet.py) are refactored into two files totaling about 400 lines (cell.py, pnasnet.py). This code no longer supports the NCHW data format, primarily because the released model was trained with NHWC. The simplification preserves the rough structure and all functionality of the official implementation.
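Since the NHWC-only restriction matters when adapting the code, here is a minimal NumPy illustration (not from the repo) of the two layouts and how one converts between them. The 331×331 input size matches the PNASNet-5_Large_331 model used below.

```python
import numpy as np

# A toy batch of 2 RGB images at 331x331 (the PNASNet-5_Large input size),
# in NHWC layout (batch, height, width, channels) -- the layout this code assumes.
nhwc = np.zeros((2, 331, 331, 3), dtype=np.float32)

# The equivalent NCHW tensor (batch, channels, height, width) is a transpose:
nchw = np.transpose(nhwc, (0, 3, 1, 2))

print(nhwc.shape)  # (2, 331, 331, 3)
print(nchw.shape)  # (2, 3, 331, 331)
```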

If you use the code, please cite:

@inproceedings{liu2018progressive,
  author    = {Chenxi Liu and
               Barret Zoph and
               Maxim Neumann and
               Jonathon Shlens and
               Wei Hua and
               Li{-}Jia Li and
               Li Fei{-}Fei and
               Alan L. Yuille and
               Jonathan Huang and
               Kevin Murphy},
  title     = {Progressive Neural Architecture Search},
  booktitle = {European Conference on Computer Vision},
  year      = {2018}
}

Requirements

  • TensorFlow 1.8.0
  • torchvision 0.2.1 (for dataset loading)

Data and Model Preparation

  • Download the ImageNet validation set and move images to labeled subfolders. To do the latter, you can use this script. Make sure the folder val is under data/.
  • Download the PNASNet-5_Large_331 pretrained model:
cd data
wget https://storage.googleapis.com/download.tensorflow.org/models/pnasnet-5_large_2017_12_13.tar.gz
tar xvf pnasnet-5_large_2017_12_13.tar.gz
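The linked helper script handles the move into labeled subfolders; the sketch below shows the same idea with only the standard library. The `labels_file` format ("filename wnid" per line) is an assumption for illustration — the actual mapping format depends on the script you use.

```python
import os
import shutil

def move_to_subfolders(val_dir, labels_file):
    """Move each validation image into a subfolder named after its class.

    Assumes `labels_file` has one "filename wnid" pair per line
    (a hypothetical format; adapt to your mapping file).
    """
    with open(labels_file) as f:
        for line in f:
            filename, wnid = line.split()
            class_dir = os.path.join(val_dir, wnid)
            os.makedirs(class_dir, exist_ok=True)
            # Move data/val/ILSVRC2012_val_xxxxxxxx.JPEG -> data/val/<wnid>/
            shutil.move(os.path.join(val_dir, filename),
                        os.path.join(class_dir, filename))
```

After this step, `torchvision.datasets.ImageFolder` can read `data/val` directly, since it expects one subfolder per class.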

Usage

python main.py

The last printed line should read:

Test: [50000/50000]	Prec@1 0.829	Prec@5 0.962
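Prec@1 and Prec@5 are top-1 and top-5 precision: the fraction of images whose true label is among the model's 1 (resp. 5) highest-scoring classes. A small NumPy illustration of the metric (not the repo's evaluation code):

```python
import numpy as np

def topk_precision(logits, labels, k):
    """Fraction of samples whose true label is among the k highest logits."""
    topk = np.argsort(logits, axis=1)[:, -k:]        # indices of the k largest logits
    correct = (topk == labels[:, None]).any(axis=1)  # true label among them?
    return correct.mean()

# Toy example: 3 samples, 3 classes.
logits = np.array([[0.1, 0.7, 0.2],
                   [0.9, 0.05, 0.05],
                   [0.3, 0.3, 0.4]])
labels = np.array([1, 2, 2])
print(topk_precision(logits, labels, 1))  # 2 of 3 correct at top-1
print(topk_precision(logits, labels, 2))  # all 3 correct within top-2
```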