
JiaminRen / Randwirenn

PyTorch implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"


Projects that are alternatives to or similar to Randwirenn

Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-62.22%)
Mutual labels:  imagenet, neural-architecture-search
regnet.pytorch
PyTorch-style and human-readable RegNet with a spectrum of pre-trained models
Stars: ✭ 50 (-81.48%)
Mutual labels:  imagenet, neural-architecture-search
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (+14.44%)
Mutual labels:  imagenet, neural-architecture-search
Dawn Bench Entries
DAWNBench: An End-to-End Deep Learning Benchmark and Competition
Stars: ✭ 254 (-5.93%)
Mutual labels:  deeplearning, imagenet
Switchable Normalization
Code for Switchable Normalization from "Differentiable Learning-to-Normalize via Switchable Normalization", https://arxiv.org/abs/1806.10779
Stars: ✭ 804 (+197.78%)
Mutual labels:  deeplearning, imagenet
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-58.52%)
Mutual labels:  imagenet, neural-architecture-search
Randwirenn
Implementation of: "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 675 (+150%)
Mutual labels:  imagenet, neural-architecture-search
Atomnas
Code for ICLR 2020 paper 'AtomNAS: Fine-Grained End-to-End Neural Architecture Search'
Stars: ✭ 197 (-27.04%)
Mutual labels:  imagenet, neural-architecture-search
Tf Mobilenet V2
Mobilenet V2(Inverted Residual) Implementation & Trained Weights Using Tensorflow
Stars: ✭ 85 (-68.52%)
Mutual labels:  deeplearning, imagenet
TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (-75.56%)
Mutual labels:  imagenet, neural-architecture-search
recurrent-defocus-deblurring-synth-dual-pixel
Reference github repository for the paper "Learning to Reduce Defocus Blur by Realistically Modeling Dual-Pixel Data". We propose a procedure to generate realistic DP data synthetically. Our synthesis approach mimics the optical image formation found on DP sensors and can be applied to virtual scenes rendered with standard computer software. Lev…
Stars: ✭ 30 (-88.89%)
Mutual labels:  deeplearning
Interstellar
Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding. NeurIPS 2020.
Stars: ✭ 28 (-89.63%)
Mutual labels:  neural-architecture-search
NEATEST
NEATEST: Evolving Neural Networks Through Augmenting Topologies with Evolution Strategy Training
Stars: ✭ 13 (-95.19%)
Mutual labels:  neural-architecture-search
Fixmatch Pytorch
Unofficial PyTorch implementation of "FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence"
Stars: ✭ 259 (-4.07%)
Mutual labels:  deeplearning
Similarity-Adaptive-Deep-Hashing
Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization (TPAMI2018)
Stars: ✭ 18 (-93.33%)
Mutual labels:  deeplearning
Data-Analysis
Different types of data analytics projects : EDA, PDA, DDA, TSA and much more.....
Stars: ✭ 22 (-91.85%)
Mutual labels:  deeplearning
Deep-Learning
It contains the coursework and the practice I have done while learning Deep Learning.🚀 👨‍💻💥 🚩🌈
Stars: ✭ 21 (-92.22%)
Mutual labels:  deeplearning
Multi-Face-Comparison
This repo is meant for a backend API for face comparison and computer vision. It is built on the Python Flask framework
Stars: ✭ 20 (-92.59%)
Mutual labels:  deeplearning
Archai
Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266 (-1.48%)
Mutual labels:  neural-architecture-search
deep sort
Deep Sort algorithm C++ version
Stars: ✭ 60 (-77.78%)
Mutual labels:  deeplearning

RandWireNN (Randomly Wired Neural Network)

PyTorch implementation of "Exploring Randomly Wired Neural Networks for Image Recognition".

Update

  • 2019/4/10: Released a result for the regular-computation (C=109) RandWire-WS(4, 0.75) model, which reaches 77.07% top-1 accuracy on the ImageNet dataset.
  • 2019/4/7: The RandWireNN code is released.

Reproduced results

Model                          Paper's Top-1   Mine Top-1   Epochs   LR Scheduler   Weight Decay
RandWire-WS(4, 0.75), C=109    79%             77% *        100      cosine lr      5e-5
RandWire-WS(4, 0.75), C=78     74.7%           73.97% *     250      cosine lr      5e-5

* These results do not use dropout, DropPath, or label smoothing; I will retrain the models with these techniques.
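
For reference, "WS(4, 0.75)" denotes a Watts–Strogatz random graph with K=4 ring neighbours and rewiring probability P=0.75. Below is a minimal sketch (not the repository's code) of how such a graph could be generated with networkx (listed under Requirements below) and oriented into a DAG by directing each edge from the lower-numbered to the higher-numbered node, roughly as in the paper; the node count N=32 is an assumption taken from the config file name config_regular_c109_n32.yaml.

# Hedged sketch: generate a WS(K=4, P=0.75) random graph and orient it as a DAG.
# N=32 is assumed from the config file name, not confirmed by this README.
import networkx as nx

N, K, P = 32, 4, 0.75
ws = nx.connected_watts_strogatz_graph(N, K, P, seed=0)  # undirected WS graph

# Direct every edge from the smaller to the larger node index.
dag = nx.DiGraph()
dag.add_nodes_from(range(N))
dag.add_edges_from((min(u, v), max(u, v)) for u, v in ws.edges())

assert nx.is_directed_acyclic_graph(dag)
print(dag.number_of_nodes(), "nodes,", dag.number_of_edges(), "edges")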

Requirements

  • python packages
    • pytorch>=0.4.1
    • torchvision>=0.2.1
    • tensorboardX
    • pyyaml
    • CVdevKit
    • networkx
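
Not part of the repository, but a quick way to sanity-check that the listed packages are importable and to see their versions (CVdevKit is left out of this sketch):

# Hedged convenience sketch: print versions of the requirements that expose one.
import networkx
import tensorboardX
import torch
import torchvision
import yaml

for name, mod in [("pytorch", torch), ("torchvision", torchvision),
                  ("tensorboardX", tensorboardX), ("pyyaml", yaml),
                  ("networkx", networkx)]:
    print(name, getattr(mod, "__version__", "unknown"))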

Data Preparation

Download the ImageNet dataset and put it under {repo_root}/data/imagenet.
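
The exact folder layout is not spelled out here; assuming the usual torchvision ImageFolder layout ({repo_root}/data/imagenet/train/<class>/... and {repo_root}/data/imagenet/val/<class>/...), loading the data would look roughly like the sketch below. The crop size and normalization constants are the standard ImageNet choices, not necessarily the ones this repository uses.

# Hedged sketch: load ImageNet with torchvision, assuming train/val class subfolders.
import torchvision.datasets as datasets
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])
train_set = datasets.ImageFolder(
    "data/imagenet/train",
    transforms.Compose([
        transforms.RandomResizedCrop(224),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        normalize,
    ]))
train_loader = DataLoader(train_set, batch_size=256, shuffle=True,
                          num_workers=8, pin_memory=True)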

Training a model from scratch

./train.sh configs/config_regular_c109_n32.yaml

License

All materials in this repository are released under the Apache License 2.0.
