
gaosh / Structured-Bayesian-Pruning-pytorch

Licence: other
pytorch implementation of Structured Bayesian Pruning

Programming Languages

python

Projects that are alternatives of or similar to Structured-Bayesian-Pruning-pytorch

Pruning
Code for "Co-Evolutionary Compression for Unpaired Image Translation" (ICCV 2019) and "SCOP: Scientific Control for Reliable Neural Network Pruning" (NeurIPS 2020).
Stars: ✭ 159 (+783.33%)
Mutual labels:  model-compression
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+1138.89%)
Mutual labels:  model-compression
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (+277.78%)
Mutual labels:  bayesian-deep-learning
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+822.22%)
Mutual labels:  model-compression
Torch Pruning
A pytorch pruning toolkit for structured neural network pruning and layer dependency maintaining.
Stars: ✭ 193 (+972.22%)
Mutual labels:  model-compression
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (+1033.33%)
Mutual labels:  model-compression
Amc Models
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 154 (+755.56%)
Mutual labels:  model-compression
torch-model-compression
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a model-compression algorithm library that analyzes model structure automatically.
Stars: ✭ 126 (+600%)
Mutual labels:  model-compression
Bert Of Theseus
⛵️The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
Stars: ✭ 209 (+1061.11%)
Mutual labels:  model-compression
ZAQ-code
CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (+227.78%)
Mutual labels:  model-compression
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (+861.11%)
Mutual labels:  model-compression
Jfasttext
Java interface for fastText
Stars: ✭ 193 (+972.22%)
Mutual labels:  model-compression
pytorch-convcnp
A PyTorch Implementation of Convolutional Conditional Neural Process.
Stars: ✭ 41 (+127.78%)
Mutual labels:  bayesian-deep-learning
Keras compressor
Model Compression CLI Tool for Keras.
Stars: ✭ 160 (+788.89%)
Mutual labels:  model-compression
Dropout BBalpha
Implementations of the ICML 2017 paper (with Yarin Gal)
Stars: ✭ 40 (+122.22%)
Mutual labels:  bayesian-deep-learning
Pytorch Weights pruning
PyTorch Implementation of Weights Pruning
Stars: ✭ 158 (+777.78%)
Mutual labels:  model-compression
Pocketflow
An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications.
Stars: ✭ 2,672 (+14744.44%)
Mutual labels:  model-compression
deep-weight-prior
The Deep Weight Prior, ICLR 2019
Stars: ✭ 42 (+133.33%)
Mutual labels:  bayesian-deep-learning
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (+100%)
Mutual labels:  model-compression
Auto-Compression
Automatic DNN compression tool with various model compression and neural architecture search techniques
Stars: ✭ 19 (+5.56%)
Mutual labels:  model-compression

Structured-Bayesian-Pruning-pytorch

PyTorch implementation of Structured Bayesian Pruning (NIPS 2017). The authors of the paper provide a TensorFlow implementation. This implementation is built on PyTorch 0.4.

Some preliminary results on MNIST:

Network   Method    Error   Neurons per Layer
LeNet-5   Original  0.68    20 - 50 - 800 - 500
LeNet-5   SBP       0.86    3 - 18 - 284 - 283
LeNet-5   SBP*      1.17    8 - 15 - 163 - 81

SBP* denotes the results from my implementation; I believe they can be improved with hyperparameter tuning.

As a byproduct of my implementation, I roughly plotted average layerwise sparsity against model performance on MNIST. Average layerwise sparsity is not an accurate approximation of the compression rate, but it gives an idea of how the two are related in Structured Bayesian Pruning.
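As a rough, back-of-the-envelope illustration (not code from this repository) of what average layerwise sparsity measures, the snippet below works through the SBP* row of the table above, assuming the last two entries correspond to the two fully connected layers of LeNet-5; it also hints at why the average is only a loose proxy for the compression rate.

# Hypothetical illustration based on the SBP* row of the table above.
original_neurons = [20, 50, 800, 500]
pruned_neurons   = [8, 15, 163, 81]

# Per-layer sparsity: fraction of neurons removed in each layer.
layer_sparsity = [1 - p / o for p, o in zip(pruned_neurons, original_neurons)]
avg_sparsity = sum(layer_sparsity) / len(layer_sparsity)
print(layer_sparsity)   # ~[0.60, 0.70, 0.80, 0.84]
print(avg_sparsity)     # ~0.73

# The 800 -> 500 fully connected layer keeps only 163 * 81 of its 800 * 500
# weights, i.e. roughly 3% -- far more aggressive than the ~73% average
# layerwise sparsity suggests, since weight counts scale with the product of
# adjacent layer widths. That is one reason the average is only a rough proxy.
fc_weight_kept = (163 * 81) / (800 * 500)
print(fc_weight_kept)   # ~0.033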

The code only contains the script to reproduce the MNIST experiment (LeNet_MNIST.py); however, it can easily be extended to other models or datasets. Here is a simple example of how to add Structured Bayesian Pruning to your own model.

from SBP_utils import SBP_layer
import torch.nn as nn
import torch

batch = 3
input_dim = 5 
output_dim = 10

# for a CNN layer, input_dim is the number of channels; for a linear layer, it is the number of neurons
linear = nn.Linear(input_dim,output_dim)
sbp_layer = SBP_layer(output_dim)

#perform forward pass
x = torch.randn(batch, input_dim)
x = linear(x)
y, kl = sbp_layer(x)

# don't forget to add the KL term to the loss
# (here, loss is your usual task loss, e.g. cross-entropy, computed elsewhere in the training loop)
loss = loss + kl
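For a fuller picture, here is a minimal sketch (not taken from this repository) of how the same pattern could be wired into an nn.Module that accumulates the KL term; it assumes only the interface shown above, i.e. that SBP_layer(dim) returns (masked_output, kl) from its forward pass.

# A minimal sketch, not from this repository: an MLP with an SBP layer after
# the hidden linear layer. Assumes SBP_layer(dim) returns (masked_output, kl).
import torch
import torch.nn as nn
import torch.nn.functional as F
from SBP_utils import SBP_layer

class SBP_MLP(nn.Module):
    def __init__(self, in_dim=784, hidden=500, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.sbp1 = SBP_layer(hidden)      # prunes the hidden neurons
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x, kl = self.sbp1(x)               # masked activations + KL penalty
        return self.fc2(x), kl

# one hypothetical training step
model = SBP_MLP()
images = torch.randn(3, 784)
labels = torch.randint(0, 10, (3,))
logits, kl = model(images)
loss = F.cross_entropy(logits, labels) + kl   # task loss + SBP KL term
loss.backward()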