
INCHEON-CHO / Dynamic_Model_Pruning_with_Feedback

Licence: other
Unofficial PyTorch implementation of "Dynamic Model Pruning with Feedback"

Programming Languages

python
shell

Projects that are alternatives of or similar to Dynamic Model Pruning with Feedback

SSD-Pruning-and-quantization
Pruning and quantization for SSD. Model compression.
Stars: ✭ 19 (-24%)
Mutual labels:  pruning, prune
nuxt-prune-html
🔌⚡ Nuxt module to prune html before sending it to the browser (it removes elements matching CSS selector(s)), useful for boosting performance showing a different HTML for bots/audits by removing all the scripts with dynamic rendering
Stars: ✭ 69 (+176%)
Mutual labels:  pruning, prune
Pruning filters for efficient convnets
PyTorch implementation of "Pruning Filters For Efficient ConvNets"
Stars: ✭ 96 (+284%)
Mutual labels:  pruning, pytorch-implementation
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+14940%)
Mutual labels:  pruning, pruning-structures
ViNet
ViNet Pushing the limits of Visual Modality for Audio Visual Saliency Prediction
Stars: ✭ 36 (+44%)
Mutual labels:  pytorch-implementation
cosine-ood-detector
Hyperparameter-Free Out-of-Distribution Detection Using Softmax of Scaled Cosine Similarity
Stars: ✭ 30 (+20%)
Mutual labels:  pytorch-implementation
RandLA-Net-pytorch
🍀 Pytorch Implementation of RandLA-Net (https://arxiv.org/abs/1911.11236)
Stars: ✭ 69 (+176%)
Mutual labels:  pytorch-implementation
Text-Classification-LSTMs-PyTorch
The aim of this repository is to show a baseline model for text classification by implementing a LSTM-based model coded in PyTorch. In order to provide a better understanding of the model, it will be used a Tweets dataset provided by Kaggle.
Stars: ✭ 45 (+80%)
Mutual labels:  pytorch-implementation
DocuNet
Code and dataset for the IJCAI 2021 paper "Document-level Relation Extraction as Semantic Segmentation".
Stars: ✭ 84 (+236%)
Mutual labels:  pytorch-implementation
ActiveSparseShifts-PyTorch
Implementation of Sparse Shift Layer and Active Shift Layer (3D, 4D, 5D tensors) for PyTorch(CPU,GPU)
Stars: ✭ 27 (+8%)
Mutual labels:  pytorch-implementation
GAN-LTH
[ICLR 2021] "GANs Can Play Lottery Too" by Xuxi Chen, Zhenyu Zhang, Yongduo Sui, Tianlong Chen
Stars: ✭ 24 (-4%)
Mutual labels:  pruning
deep-blueberry
If you've always wanted to learn about deep-learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-32%)
Mutual labels:  pytorch-implementation
Awesome-Pytorch-Tutorials
Awesome Pytorch Tutorials
Stars: ✭ 23 (-8%)
Mutual labels:  pytorch-implementation
semi-supervised-paper-implementation
Reproduce some methods in semi-supervised papers.
Stars: ✭ 35 (+40%)
Mutual labels:  pytorch-implementation
docker-system-prune
Docker system prune automatically
Stars: ✭ 20 (-20%)
Mutual labels:  prune
MobileHumanPose
This repo is official PyTorch implementation of MobileHumanPose: Toward real-time 3D human pose estimation in mobile devices(CVPRW 2021).
Stars: ✭ 206 (+724%)
Mutual labels:  pytorch-implementation
deep-compression
Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626
Stars: ✭ 156 (+524%)
Mutual labels:  pruning
Deep-Learning-Pytorch
A repo containing code covering various aspects of deep learning on Pytorch. Great for beginners and intermediate in the field
Stars: ✭ 59 (+136%)
Mutual labels:  pytorch-implementation
prunnable-layers-pytorch
Prunable nn layers for pytorch.
Stars: ✭ 47 (+88%)
Mutual labels:  pruning
Generative MLZSL
[TPAMI Under Submission] Generative Multi-Label Zero-Shot Learning
Stars: ✭ 37 (+48%)
Mutual labels:  pytorch-implementation

Dynamic Model Pruning with Feedback

Paper link: Dynamic Model Pruning with Feedback (ICLR 2020)

This is an UNOFFICIAL implementation!

For the hyperparameter settings, see the appendix of the paper.

Abstract

(1) Allowing dynamic allocation of the sparsity pattern

(2) Incorporating a feedback signal to reactivate prematurely pruned weights
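The two ideas above can be sketched in a minimal toy step (NumPy, not the repo's actual training loop): keep a dense copy of the weights, apply a magnitude mask in the forward pass, then apply the gradient computed at the pruned weights back to the dense copy, so pruned weights keep receiving updates and can later reactivate.

```python
import numpy as np

def magnitude_mask(w, sparsity):
    """Binary mask keeping the largest-magnitude weights (unstructured pruning)."""
    k = int(np.ceil(sparsity * w.size))          # number of weights to prune
    if k == 0:
        return np.ones_like(w)
    thresh = np.sort(np.abs(w).ravel())[k - 1]   # k-th smallest magnitude
    return (np.abs(w) > thresh).astype(w.dtype)

# One DPF-style step on a toy quadratic loss f(w) = ||w - target||^2.
rng = np.random.default_rng(0)
w_dense = rng.normal(size=10)     # dense weights are kept throughout training
target = np.full(10, 0.5)
lr, sparsity = 0.1, 0.5

w_before = w_dense.copy()
mask = magnitude_mask(w_dense, sparsity)   # (1) mask is recomputed each time,
w_pruned = mask * w_dense                  #     so the sparsity pattern is dynamic
grad = 2.0 * (w_pruned - target)           # gradient evaluated at the PRUNED weights
w_dense -= lr * grad                       # (2) feedback: the DENSE copy is updated,
                                           #     so pruned weights can reactivate later
```

Note that even the masked-out coordinates receive a gradient update here, which is exactly the feedback mechanism: a weight pruned too early can grow back above the magnitude threshold at the next mask refresh.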

Method

(Figures from the paper illustrating the DPF method; images not included here.)

Run

python main.py cifar10 --datapath DATAPATH -a resnet --layers 56 -C -g 0 --save train.pth \
--epochs 300 --batch-size 128 --lr 0.2 --wd 1e-4 --nesterov --scheduler multistep --milestones 150 225 --gamma 0.1

Experiment

Model      Best Top-1 Acc (%)   Sparsity (%)
Baseline   93.97                0
DPF        93.73                90.00

Experiments on ResNet-56 for CIFAR-10

DPF run:

python main.py cifar10 --datapath DATAPATH -a resnet --layers 56 -C -g 0 --save prune.pth \
-P --prune-type unstructured --prune-freq 16 --prune-rate 0.9 --prune-imp L2 \
--epochs 300 --batch-size 128  --lr 0.2 --wd 1e-4 --nesterov --scheduler multistep --milestones 150 225 --gamma 0.1
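With --prune-freq 16, the mask is refreshed every 16 iterations while the target sparsity ramps up to --prune-rate 0.9. The exact schedule used by main.py is defined in the code; as one plausible sketch, here is the common polynomial ramp of Zhu & Gupta (2017), which may differ from what this repo actually implements:

```python
def sparsity_at(step, total_steps, final_sparsity):
    """Polynomial ramp from 0 to final_sparsity over training (Zhu & Gupta, 2017)."""
    t = min(max(step / total_steps, 0.0), 1.0)
    return final_sparsity * (1.0 - (1.0 - t) ** 3)

# Hypothetical use inside the training loop:
# if step % 16 == 0:                                   # --prune-freq 16
#     s = sparsity_at(step, total_steps, 0.9)          # --prune-rate 0.9
#     recompute the magnitude mask at sparsity s (L2 importance, --prune-imp L2)
```

The ramp prunes slowly at first, when weight magnitudes are still settling, and reaches the full 90% sparsity well before the end of training.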