
tyui592 / Pruning_filters_for_efficient_convnets

Licence: other
PyTorch implementation of "Pruning Filters For Efficient ConvNets"

Programming Languages

  • Jupyter Notebook
  • Python

Projects that are alternatives of or similar to Pruning filters for efficient convnets

Chainer Cifar10
Various CNN models for CIFAR10 with Chainer
Stars: ✭ 134 (+39.58%)
Mutual labels:  vgg, cifar10
RMNet
The RM operation can equivalently convert a ResNet into a VGG, which is better suited to pruning, and can help RepVGG perform better when the depth is large.
Stars: ✭ 129 (+34.38%)
Mutual labels:  vgg, pruning
Keras-CIFAR10
practice on CIFAR10 with Keras
Stars: ✭ 25 (-73.96%)
Mutual labels:  vgg, cifar10
Dynamic Model Pruning with Feedback
PyTorch implementation of Dynamic Model Pruning with Feedback
Stars: ✭ 25 (-73.96%)
Mutual labels:  pruning, pytorch-implementation
TailCalibX
Pytorch implementation of Feature Generation for Long-Tail Classification by Rahul Vigneswaran, Marc T Law, Vineeth N Balasubramaniam and Makarand Tapaswi
Stars: ✭ 32 (-66.67%)
Mutual labels:  pytorch-implementation
nvae
An unofficial toy implementation of NVAE, "A Deep Hierarchical Variational Autoencoder"
Stars: ✭ 83 (-13.54%)
Mutual labels:  pytorch-implementation
tfvaegan
[ECCV 2020] Official Pytorch implementation for "Latent Embedding Feedback and Discriminative Features for Zero-Shot Classification". SOTA results for ZSL and GZSL
Stars: ✭ 107 (+11.46%)
Mutual labels:  pytorch-implementation
FisherPruning
Group Fisher Pruning for Practical Network Compression(ICML2021)
Stars: ✭ 127 (+32.29%)
Mutual labels:  pruning
SPAN
Semantics-guided Part Attention Network (ECCV 2020 Oral)
Stars: ✭ 19 (-80.21%)
Mutual labels:  pytorch-implementation
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-57.29%)
Mutual labels:  pruning
DocTr
The official code for “DocTr: Document Image Transformer for Geometric Unwarping and Illumination Correction”, ACM MM, Oral Paper, 2021.
Stars: ✭ 202 (+110.42%)
Mutual labels:  pytorch-implementation
depth-map-prediction
Pytorch Implementation of Depth Map Prediction from a Single Image using a Multi-Scale Deep Network
Stars: ✭ 78 (-18.75%)
Mutual labels:  pytorch-implementation
pcdarts-tf2
PC-DARTS (PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search, published in ICLR 2020) implemented in Tensorflow 2.0+. This is an unofficial implementation.
Stars: ✭ 25 (-73.96%)
Mutual labels:  cifar10
Deep-MVLM
A tool for precisely placing 3D landmarks on 3D facial scans based on the paper "Multi-view Consensus CNN for 3D Facial Landmark Placement"
Stars: ✭ 71 (-26.04%)
Mutual labels:  pytorch-implementation
AdaSpeech
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+12.5%)
Mutual labels:  pytorch-implementation
loc2vec
Pytorch implementation of the Loc2Vec with some modifications for speed
Stars: ✭ 40 (-58.33%)
Mutual labels:  pytorch-implementation
Printed-Chinese-Character-OCR
A Chinese-character OCR system based on deep learning (a VGG-like CNN). The repo includes training-set generation, image preprocessing, and model optimization built on the Keras high-level NN framework.
Stars: ✭ 21 (-78.12%)
Mutual labels:  vgg
image-defect-detection-based-on-CNN
TensorBasicModel
Stars: ✭ 17 (-82.29%)
Mutual labels:  cifar10
DL.EyeSight
Mainly uses SSD, YOLO, and other models to solve object-detection problems in images and video
Stars: ✭ 48 (-50%)
Mutual labels:  vgg
PyTorch
An open source deep learning platform that provides a seamless path from research prototyping to production deployment
Stars: ✭ 17 (-82.29%)
Mutual labels:  pytorch-implementation

Pruning Filters For Efficient ConvNets

Unofficial PyTorch implementation of filter pruning for VGG on the CIFAR-10 dataset

Reference: Pruning Filters For Efficient ConvNets, ICLR 2017

Contact: Minseong Kim ([email protected])

Requirements

  • torch (version: 1.2.0)
  • torchvision (version: 0.4.0)
  • Pillow (version: 6.1.0)
  • matplotlib (version: 3.1.1)
  • numpy (version: 1.16.5)

Usage

Arguments

  • --train-flag: Train VGG on the CIFAR dataset
  • --save-path: Path to save results, ex) trained_models/
  • --load-path: Path to load a checkpoint; append 'checkpoint.pth' to the save path, ex) trained_models/checkpoint.pth
  • --resume-flag: Resume training from the checkpoint loaded with --load-path
  • --prune-flag: Prune VGG
  • --prune-layers: List of target convolution layers for pruning, ex) conv1 conv2
  • --prune-channels: Number of channels to prune in each layer of --prune-layers, ex) 4 14
  • --independent-prune-flag: Prune multiple layers with the independent strategy
  • --retrain-flag: Retrain the pruned network
  • --retrain-epoch: Number of epochs for retraining the pruned network
  • --retrain-lr: Learning rate for retraining the pruned network
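The pruning criterion follows the paper: each filter in a convolution layer is ranked by the sum of its absolute kernel weights, and the `--prune-channels` smallest filters are removed. A minimal NumPy sketch of that ranking (illustrative only; the repo operates on `torch` tensors, and `select_filters_to_prune` is a hypothetical name, not a function from this codebase):

```python
import numpy as np

def select_filters_to_prune(weight, n_prune):
    """Return indices of the n_prune filters with the smallest
    absolute weight sums (the pruning criterion of the paper).

    weight: array of shape (out_channels, in_channels, kH, kW)
    """
    # L1-norm of each output filter's kernel weights
    filter_norms = np.abs(weight).sum(axis=(1, 2, 3))
    # indices of the weakest filters, smallest sums first
    return np.argsort(filter_norms)[:n_prune]

# toy example: a conv layer with 4 filters of shape (3, 3, 3)
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 3, 3, 3))
w[2] *= 0.01  # make filter 2 clearly the weakest
print(select_filters_to_prune(w, 1))  # → [2]
```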

Example Scripts

Train VGG on CIFAR-10 Data set

python main.py --train-flag --data-set CIFAR10 --vgg vgg16_bn --save-path ./trained_models/

Prune VGG with the 'greedy strategy'

python main.py --prune-flag --load-path ./trained_models/check_point.pth --save-path ./trained_models/pruning_results/ --prune-layers conv1 conv2 --prune-channels 1 1

Prune VGG with the 'independent strategy'

python main.py --prune-flag --load-path ./trained_models/check_point.pth --save-path ./trained_models/pruning_results/ --prune-layers conv1 conv2 --prune-channels 1 1 --independent-prune-flag
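The two strategies differ in how later layers are ranked when several layers are pruned at once: the independent strategy ranks every layer on its original weights, while the greedy strategy first drops the kernel weights on input channels fed by already-pruned filters. A hypothetical NumPy sketch of that difference for two consecutive conv layers (function and variable names are illustrative, not from this codebase):

```python
import numpy as np

def filter_norms(weight, keep_in=None):
    """L1-norm per output filter, optionally restricted to kept input channels."""
    if keep_in is not None:
        weight = weight[:, keep_in]  # ignore input channels fed by pruned filters
    return np.abs(weight).sum(axis=(1, 2, 3))

def prune_two_layers(w1, w2, n1, n2, greedy=True):
    """Return the filter indices to prune in two consecutive conv layers."""
    prune1 = np.argsort(filter_norms(w1))[:n1]
    if greedy:
        # greedy: rank layer 2 only on input channels that survive layer-1 pruning
        keep = [i for i in range(w1.shape[0]) if i not in set(prune1)]
        prune2 = np.argsort(filter_norms(w2, keep_in=keep))[:n2]
    else:
        # independent: rank layer 2 on its original, unpruned weights
        prune2 = np.argsort(filter_norms(w2))[:n2]
    return prune1, prune2

rng = np.random.default_rng(1)
w1 = rng.standard_normal((8, 3, 3, 3))   # conv1: 8 filters, 3 input channels
w2 = rng.standard_normal((8, 8, 3, 3))   # conv2: 8 filters, 8 input channels
print(prune_two_layers(w1, w2, 2, 2, greedy=True))
```

The first layer's selection is identical under both strategies; only the ranking of subsequent layers changes.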

Retrain the pruned network

python main.py --prune-flag --load-path ./trained_models/check_point.pth --save-path ./trained_models/pruning_results/ --prune-layers conv1 --prune-channels 1 --retrain-flag --retrain-epoch 20 --retrain-lr 0.001

Results

Absolute sum of filter weights for each layer of VGG-16 trained on CIFAR-10

figure1

Pruning filters with the lowest absolute weights sum and their corresponding test accuracies on CIFAR-10

figure2

Prune and retrain for each single layer of VGG-16 on CIFAR-10

figure3

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].