
jaxony / Shufflenet

License: MIT
ShuffleNet in PyTorch. Based on https://arxiv.org/abs/1707.01083


ShuffleNet in PyTorch

An implementation of ShuffleNet in PyTorch. ShuffleNet is an efficient convolutional neural network architecture for mobile devices. According to the paper, it outperforms Google's MobileNet by a small margin in accuracy.
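The efficiency gain comes largely from replacing dense 1x1 convolutions with grouped ones. A quick back-of-the-envelope sketch (the channel counts below are illustrative, not taken from the paper's tables):

```python
def pointwise_conv_params(c_in, c_out, groups=1):
    # A 1x1 convolution has one weight per (input-channel, output-channel)
    # pair; with grouping, channels are split evenly across groups and each
    # output channel only connects to input channels in its own group.
    assert c_in % groups == 0 and c_out % groups == 0
    return groups * (c_in // groups) * (c_out // groups)

dense = pointwise_conv_params(240, 240)              # 57,600 weights
grouped = pointwise_conv_params(240, 240, groups=3)  # 19,200 weights
print(dense // grouped)  # grouping by g cuts 1x1-conv weights by a factor of g
```

So a group count of g reduces both the parameters and the multiply-adds of each pointwise layer by a factor of g.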

What is ShuffleNet?

In one sentence, ShuffleNet is a ResNet-like model built from residual blocks (called ShuffleUnits). Its main innovations are pointwise (1x1) group convolutions, used in place of ordinary pointwise convolutions, together with a channel shuffle operation that mixes information between the groups.
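The channel shuffle itself is just a reshape-transpose-reshape. A minimal NumPy sketch of the operation (the PyTorch version uses the same `view`/`transpose` trick on tensors):

```python
import numpy as np

def channel_shuffle(x, groups):
    # x has shape (N, C, H, W); C must be divisible by groups.
    n, c, h, w = x.shape
    x = x.reshape(n, groups, c // groups, h, w)  # split channels into groups
    x = x.transpose(0, 2, 1, 3, 4)               # interleave across groups
    return x.reshape(n, c, h, w)

# Example: 6 channels in 3 groups; channel i holds the constant value i.
x = np.arange(6).reshape(1, 6, 1, 1) * np.ones((1, 6, 2, 2))
y = channel_shuffle(x, groups=3)
print(y[0, :, 0, 0])  # channel order becomes [0, 2, 4, 1, 3, 5]
```

After the shuffle, each group in the next grouped convolution sees one channel from every group of the previous layer, which is what lets information flow between groups.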

Usage

Clone the repo:

git clone https://github.com/jaxony/ShuffleNet.git

Use the model defined in model.py:

from model import ShuffleNet

# running on MNIST
net = ShuffleNet(num_classes=10, in_channels=1)

Performance

Trained on ImageNet (using the PyTorch ImageNet example) with groups=3 and no channel multiplier. On the test set, the model reached 62.2% top-1 and 84.2% top-5 accuracy. Unfortunately, these results aren't directly comparable to Table 5 of the paper, because the authors don't train a network with these exact settings; they fall somewhere between the network with groups=3 and half the number of channels (42.8% top-1) and the network with the same number of channels but groups=8 (32.4% top-1). The pretrained state dictionary can be found here, in the following format:

{
    'epoch': epoch + 1,
    'arch': args.arch,
    'state_dict': model.state_dict(),
    'best_prec1': best_prec1,
    'optimizer' : optimizer.state_dict()
}

Note: the model was trained with the default settings of the PyTorch ImageNet example, which differ from the training regime described in the paper. Retraining with the paper's settings (and groups=8) is still pending.
