
jack-willturner / deep-compression

License: MIT
Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language
Shell
77523 projects

Projects that are alternatives to, or similar to, deep-compression

torchprune
A research library for pytorch-based neural network pruning, compression, and more.
Stars: ✭ 133 (-14.74%)
Mutual labels:  sparsity, pruning
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool), which aims to provide unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+326.92%)
Mutual labels:  sparsity, pruning
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (+10.9%)
Mutual labels:  pruning
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+6.41%)
Mutual labels:  pruning
Hrank
Pytorch implementation of our CVPR 2020 (Oral) -- HRank: Filter Pruning using High-Rank Feature Map
Stars: ✭ 164 (+5.13%)
Mutual labels:  pruning
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning that maintains layer dependencies.
Stars: ✭ 193 (+23.72%)
Mutual labels:  pruning
strollr2d icassp2017
Image denoising code using STROLLR learning; the MATLAB implementation of the ICASSP 2017 paper.
Stars: ✭ 22 (-85.9%)
Mutual labels:  sparsity
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (+30.77%)
Mutual labels:  pruning
attention-guided-sparsity
Attention-Based Guided Structured Sparsity of Deep Neural Networks
Stars: ✭ 26 (-83.33%)
Mutual labels:  sparsity
Selecsls Pytorch
Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
Stars: ✭ 251 (+60.9%)
Mutual labels:  pruning
Skimcaffe
Caffe for Sparse Convolutional Neural Network
Stars: ✭ 230 (+47.44%)
Mutual labels:  pruning
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+27.56%)
Mutual labels:  pruning
NTFk.jl
Unsupervised Machine Learning: Nonnegative Tensor Factorization + k-means clustering
Stars: ✭ 36 (-76.92%)
Mutual labels:  sparsity
Mobile Yolov5 Pruning Distillation
Pruning and distillation for mobilev2-yolov5s; supports ncnn and TensorRT deployment. Ultra-light but with better performance!
Stars: ✭ 192 (+23.08%)
Mutual labels:  pruning
PyTorch-Deep-Compression
A PyTorch implementation of the iterative pruning method described in Han et al. (2015)
Stars: ✭ 39 (-75%)
Mutual labels:  pruning
pyowl
Ordered Weighted L1 regularization for classification and regression in Python
Stars: ✭ 52 (-66.67%)
Mutual labels:  sparsity
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+42.95%)
Mutual labels:  pruning
prunnable-layers-pytorch
Prunable nn layers for pytorch.
Stars: ✭ 47 (-69.87%)
Mutual labels:  pruning
Magni
A package for AFM image reconstruction and compressed sensing in general
Stars: ✭ 37 (-76.28%)
Mutual labels:  sparsity
SMSR
[CVPR 2021] Exploring Sparsity in Image Super-Resolution for Efficient Inference
Stars: ✭ 205 (+31.41%)
Mutual labels:  sparsity

Learning both Weights and Connections for Efficient Neural Networks


A PyTorch implementation of this paper.

To run, try:

python train.py --model='resnet34' --checkpoint='resnet34'
python prune.py --model='resnet34' --checkpoint='resnet34'

Usage

The core of the training/pruning/fine-tuning workflow is as follows:

from models import get_model
from pruners import get_pruner

# Load a model and a pruning strategy (here, unstructured L1 magnitude pruning).
model = get_model("resnet18")
pruner = get_pruner("L1Pruner", "unstructured")

# Prune the model at progressively higher pruning rates.
for prune_rate in [10, 40, 60, 80]:
    pruner.prune(model, prune_rate)

We can choose between structured and unstructured pruning, as well as between the pruning methods implemented in pruners (at the time of writing, only magnitude-based pruning and Fisher pruning are supported).
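
For example, a structured Fisher-based pruner could be selected in the same way. The pruner name used below ("FisherPruner") is an assumption for illustration; check the names registered in pruners for the exact string:

# "FisherPruner" is a hypothetical name; use the identifier registered in pruners/.
fisher_pruner = get_pruner("FisherPruner", "structured")

for prune_rate in [10, 40, 60, 80]:
    fisher_pruner.prune(model, prune_rate)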

Bring your own models

To add a new model family to the repository, you only need to do two things (a minimal sketch of both steps follows the list):

  1. Swap out the convolutional layers to use the ConvBNReLU class
  2. Define a get_prunable_layers method that returns all the instances of ConvBNReLU that you want to be prunable
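
A minimal sketch of both steps, assuming ConvBNReLU fuses a convolution with its batch norm and ReLU; the import path and constructor signature used here are assumptions, so check the definition in models for the real ones:

import torch.nn as nn
from models import ConvBNReLU  # assumed import path for the repo's ConvBNReLU class

class MyNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Step 1: use ConvBNReLU in place of raw nn.Conv2d layers.
        # The (in_channels, out_channels, kernel_size) arguments are an assumed signature.
        self.block1 = ConvBNReLU(3, 64, kernel_size=3)
        self.block2 = ConvBNReLU(64, 128, kernel_size=3)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.pool(self.block2(self.block1(x)))
        return self.fc(x.flatten(1))

    # Step 2: return every ConvBNReLU instance that should be prunable.
    def get_prunable_layers(self):
        return [self.block1, self.block2]

With those two pieces in place, a pruner obtained from get_pruner can operate on the layers returned by get_prunable_layers, as in the usage example above.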

Summary

Given a family of ResNets, we can construct a Pareto frontier of the tradeoff between accuracy and number of parameters:

[Figure: Pareto frontier of accuracy vs. number of parameters for a family of ResNets]

Han et al. posit that we can beat this Pareto frontier by leaving network structures fixed, but removing individual parameters:

[Figure: the same Pareto frontier, surpassed by removing individual parameters from fixed network structures]
