
ryujaehun / alexnet

Licence: MIT license
Custom implementation of AlexNet with TensorFlow

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to or similar to alexnet

alexnet-architecture.tensorflow
Unofficial TensorFlow implementation of "AlexNet" architecture.
Stars: ✭ 15 (-28.57%)
Mutual labels:  imagenet, alexnet
Selecsls Pytorch
Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
Stars: ✭ 251 (+1095.24%)
Mutual labels:  imagenet
Imgclsmob
Sandbox for training deep learning networks
Stars: ✭ 2,405 (+11352.38%)
Mutual labels:  imagenet
Triplet Attention
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Stars: ✭ 222 (+957.14%)
Mutual labels:  imagenet
Pytorch Cpp
PyTorch C++ inference with LibTorch
Stars: ✭ 194 (+823.81%)
Mutual labels:  imagenet
Fusenet
Deep fusion project of deeply-fused nets, and the study on the connection to ensembling
Stars: ✭ 230 (+995.24%)
Mutual labels:  imagenet
Iresnet
Improved Residual Networks (https://arxiv.org/pdf/2004.04989.pdf)
Stars: ✭ 163 (+676.19%)
Mutual labels:  imagenet
alexnet-pytorch
Pytorch Implementation of AlexNet
Stars: ✭ 87 (+314.29%)
Mutual labels:  alexnet
Mobilenetv3 Pytorch
Implementing Searching for MobileNetV3 paper using Pytorch
Stars: ✭ 243 (+1057.14%)
Mutual labels:  imagenet
Moga
MoGA: Searching Beyond MobileNetV3
Stars: ✭ 220 (+947.62%)
Mutual labels:  imagenet
Mini Imagenet Tools
Tools for generating mini-ImageNet dataset and processing batches
Stars: ✭ 209 (+895.24%)
Mutual labels:  imagenet
Atomnas
Code for ICLR 2020 paper 'AtomNAS: Fine-Grained End-to-End Neural Architecture Search'
Stars: ✭ 197 (+838.1%)
Mutual labels:  imagenet
Pyconv
Pyramidal Convolution: Rethinking Convolutional Neural Networks for Visual Recognition (https://arxiv.org/pdf/2006.11538.pdf)
Stars: ✭ 231 (+1000%)
Mutual labels:  imagenet
Torchdistill
PyTorch-based modular, configuration-driven framework for knowledge distillation. 🏆 18 methods including SOTA are implemented so far. 🎁 Trained models, training logs and configurations are available to ensure reproducibility.
Stars: ✭ 177 (+742.86%)
Mutual labels:  imagenet
Dawn Bench Entries
DAWNBench: An End-to-End Deep Learning Benchmark and Competition
Stars: ✭ 254 (+1109.52%)
Mutual labels:  imagenet
Senet Caffe
A Caffe Re-Implementation of SENet
Stars: ✭ 169 (+704.76%)
Mutual labels:  imagenet
Labelimg
🖍️ LabelImg is a graphical image annotation tool for labeling object bounding boxes in images
Stars: ✭ 16,088 (+76509.52%)
Mutual labels:  imagenet
Octconv.pytorch
PyTorch implementation of Octave Convolution with pre-trained Oct-ResNet and Oct-MobileNet models
Stars: ✭ 229 (+990.48%)
Mutual labels:  imagenet
nested-transformer
Nested Hierarchical Transformer https://arxiv.org/pdf/2105.12723.pdf
Stars: ✭ 174 (+728.57%)
Mutual labels:  imagenet
EffcientNetV2
EfficientNetV2 implementation using PyTorch
Stars: ✭ 94 (+347.62%)
Mutual labels:  imagenet

alexnet


about

AlexNet is the name of a convolutional neural network, originally written with CUDA to run with GPU support, which competed in the ImageNet Large Scale Visual Recognition Challenge in 2012. The network achieved a top-5 error of 15.3%, more than 10.8 percentage points ahead of the runner-up. AlexNet was designed by the SuperVision group, consisting of Alex Krizhevsky, Geoffrey Hinton, and Ilya Sutskever. (Wikipedia)

architecture

The neural network, which has 60 million parameters and 650,000 neurons, consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax. To make training faster, we used non-saturating neurons and a very efficient GPU implementation of the convolution operation. To reduce overfitting in the fully-connected layers we employed a recently-developed regularization method called “dropout” that proved to be very effective.
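
The repository targets the TensorFlow 1.x API; as a rough orientation, a minimal sketch of the layer layout described above could look like the following (this is an illustrative assumption, not the repository's train.py, which may differ in padding, initialization, and where normalization is applied):

    import tensorflow as tf

    def alexnet(images, num_classes=1000, keep_prob=0.5):
        # images: assumed [batch, 227, 227, 3]; five convolutional layers,
        # some followed by max pooling, as described in the paper.
        net = tf.layers.conv2d(images, 96, 11, strides=4, activation=tf.nn.relu)    # conv1
        net = tf.layers.max_pooling2d(net, 3, 2)
        net = tf.layers.conv2d(net, 256, 5, padding='same', activation=tf.nn.relu)  # conv2
        net = tf.layers.max_pooling2d(net, 3, 2)
        net = tf.layers.conv2d(net, 384, 3, padding='same', activation=tf.nn.relu)  # conv3
        net = tf.layers.conv2d(net, 384, 3, padding='same', activation=tf.nn.relu)  # conv4
        net = tf.layers.conv2d(net, 256, 3, padding='same', activation=tf.nn.relu)  # conv5
        net = tf.layers.max_pooling2d(net, 3, 2)
        # Three fully-connected layers with dropout, ending in a 1000-way classifier.
        shape = net.get_shape().as_list()
        net = tf.reshape(net, [-1, shape[1] * shape[2] * shape[3]])
        net = tf.layers.dense(net, 4096, activation=tf.nn.relu)
        net = tf.nn.dropout(net, keep_prob)
        net = tf.layers.dense(net, 4096, activation=tf.nn.relu)
        net = tf.nn.dropout(net, keep_prob)
        return tf.layers.dense(net, num_classes)  # logits; softmax is applied in the loss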

batch normalization

Batch normalization is a technique for reducing gradient vanishing and gradient exploding.

k = 2, n = 5, α = 10⁻⁴, β = 0.75
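
These constants are the ones the AlexNet paper uses for its local response normalization. Assuming they are applied through TensorFlow's built-in op (an assumption, not something the README states), the mapping would look roughly like this:

    import tensorflow as tf

    # Assumed mapping of the paper's constants onto tf.nn.local_response_normalization:
    # the op normalizes over a window of 2*depth_radius + 1 channels, so n = 5 gives
    # depth_radius = 2; bias corresponds to k; alpha and beta carry over directly.
    def lrn(x):
        return tf.nn.local_response_normalization(
            x, depth_radius=2, bias=2.0, alpha=1e-4, beta=0.75)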

optimizer

The Adam optimizer (tf.train.AdamOptimizer) is applied.
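
A minimal sketch of wiring up the Adam optimizer in TensorFlow 1.x (the placeholders, loss, and learning rate here are illustrative assumptions; train.py may use different values):

    import tensorflow as tf

    images = tf.placeholder(tf.float32, [None, 227, 227, 3])
    labels = tf.placeholder(tf.int64, [None])
    logits = alexnet(images)  # e.g. the architecture sketch above
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    # Illustrative learning rate only.
    train_op = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(loss)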

requirement

  • tensorflow-gpu (ver.1.3.1)
  • cv2 (ver.3.3.0)
  • numpy (ver 1.13.3)
  • scipy (ver 0.19.1)

Usage

  1. Download the image files from the link below (LSVRC2012 train, val, test, and Development kit (Task 1)).
  2. Untar them (there is a script in etc).
  3. Modify the IMAGENET_PATH hyperparameter in train.py if needed.

train


From the beginning

python3 train.py

resume training

python3 train.py -resume

test

python3 test.py

Classify

python classify.py image

tensorboard

tensorboard --logdir path/to/summary/train/
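
For this command to have anything to display, summaries have to be written under the same directory during training. A small self-contained sketch (not the repository's code; the scalar and directory are illustrative):

    import tensorflow as tf

    loss_value = tf.placeholder(tf.float32, name='loss_value')
    tf.summary.scalar('loss', loss_value)
    merged = tf.summary.merge_all()

    with tf.Session() as sess:
        # The directory must match the --logdir passed to tensorboard.
        writer = tf.summary.FileWriter('path/to/summary/train/', sess.graph)
        for step in range(100):
            summary = sess.run(merged, feed_dict={loss_value: 1.0 / (step + 1)})
            writer.add_summary(summary, global_step=step)
        writer.close()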

TODO

  • ~~apply another optimizer~~
  • ~~apply tensorboard~~
  • Fit to a GPU
  • Application of the technique to the paper
  • Eliminate bottlenecks

file_architecture

The ILSVRC 2012 training set folder should be structured like this:
		ILSVRC2012_img_train
			|_n01440764
			|_n01443537
			|_n01484850
			|_n01491361
			|_ ...

You must untar the training files with untar.sh.
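
A hypothetical sketch of how such a layout can be turned into integer class labels (variable names are illustrative; train.py may do this differently):

    import os

    IMAGENET_PATH = 'ILSVRC2012_img_train'  # adjust to the value used in train.py
    # Each synset folder name (e.g. n01440764) acts as a class; its index in the
    # sorted listing becomes the integer label fed to the 1000-way classifier.
    synsets = sorted(os.listdir(IMAGENET_PATH))
    label_of = {wnid: i for i, wnid in enumerate(synsets)}
    print(len(synsets), 'classes; n01440764 ->', label_of.get('n01440764'))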

download

Download the LSVRC 2012 image data files.

Remove log

If you do not want to see the log at startup, remove allow_soft_placement=True, log_device_placement=True at line 97 of train.py.
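
The note above refers to session options of the following kind (a sketch of a typical TensorFlow 1.x setup, not a copy of train.py); log_device_placement=True is the flag that prints the per-op device log at startup:

    import tensorflow as tf

    # With both flags present, TensorFlow logs every op's device placement at
    # startup; dropping them silences that output.
    config = tf.ConfigProto(allow_soft_placement=True, log_device_placement=True)
    sess = tf.Session(config=config)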

references

optimizer

AlexNet training on ImageNet LSVRC 2012

Tensorflow Models

Tensorflow API

Licence

MIT Licence
