he-y / Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017
Projects that are alternatives of or similar to Awesome Pruning
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (-82.99%)
Mutual labels: model-compression, pruning
Model Optimization
A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (-2.46%)
Mutual labels: model-compression, pruning
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning that maintains layer dependencies.
Stars: ✭ 193 (-81.02%)
Mutual labels: model-compression, pruning
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (-78.07%)
Mutual labels: model-compression, pruning
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-95.67%)
Mutual labels: pruning, model-compression
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (-83.68%)
Mutual labels: model-compression, pruning
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (-66.76%)
Mutual labels: model-compression, pruning
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa; Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference) and low-bit (≤2b)/ternary and binary (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); (2) pruning: normal, regular, and group convolutional channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), dynamic shape.
Stars: ✭ 1,232 (+21.14%)
Mutual labels: model-compression, pruning
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-95.97%)
Mutual labels: pruning, model-compression
torch-model-compression
An automated model-structure analysis and modification toolkit for PyTorch models, including a model compression algorithm library that analyzes model structures automatically.
Stars: ✭ 126 (-87.61%)
Mutual labels: pruning, model-compression
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (-79.94%)
Mutual labels: pruning, model-compression
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (-71.39%)
Mutual labels: model-compression, pruning
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-95.08%)
Mutual labels: pruning, model-compression
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (-33.43%)
Mutual labels: model-compression, pruning
Knowledge Distillation Papers
knowledge distillation papers
Stars: ✭ 422 (-58.51%)
Mutual labels: model-compression
Pytorch Pruning
PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference
Stars: ✭ 740 (-27.24%)
Mutual labels: pruning
Data Efficient Model Compression
Data Efficient Model Compression
Stars: ✭ 380 (-62.64%)
Mutual labels: model-compression
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+269.71%)
Mutual labels: pruning
Channel Pruning
Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
Stars: ✭ 979 (-3.74%)
Mutual labels: model-compression
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (-32.06%)
Mutual labels: model-compression
Awesome Pruning
A curated list of neural network pruning and related resources. Inspired by awesome-deep-vision, awesome-adversarial-machine-learning, awesome-deep-learning-papers and Awesome-NAS.
Please feel free to open a pull request or an issue to add papers.
Table of Contents
Type of Pruning
Type | F | W | Other
---|---|---|---
Explanation | Filter pruning | Weight pruning | Other types
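The distinction between the two main types can be sketched with a toy magnitude-based example (a NumPy illustration for orientation only, not taken from any listed repo): weight pruning (`W`) zeroes individual small-magnitude weights, while filter pruning (`F`) removes entire filters ranked by L1 norm, shrinking the layer itself.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy conv weight tensor: (out_channels, in_channels, kH, kW) = 4 filters of 3x3x3
w = rng.normal(size=(4, 3, 3, 3))

# Weight pruning (type W): zero the 50% smallest-magnitude individual weights.
# The tensor keeps its shape; sparsity is unstructured.
threshold = np.quantile(np.abs(w), 0.5)
w_weight_pruned = np.where(np.abs(w) > threshold, w, 0.0)

# Filter pruning (type F): drop the 2 filters with the smallest L1 norm.
# The layer physically shrinks, giving structured sparsity that speeds up
# dense hardware without sparse kernels.
l1_norms = np.abs(w).reshape(w.shape[0], -1).sum(axis=1)
keep = np.sort(np.argsort(l1_norms)[-2:])  # indices of the 2 strongest filters
w_filter_pruned = w[keep]

print(w_weight_pruned.shape)  # (4, 3, 3, 3) - same shape, ~50% zeros
print(w_filter_pruned.shape)  # (2, 3, 3, 3) - smaller layer
```

This is why the `F`/`W` tag matters when scanning the tables below: `W`-type methods need sparse storage or kernels to realize speedups, while `F`-type methods produce a smaller dense network directly.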
2020
2019
2018
2017
2016
Title | Venue | Type | Code
---|---|---|---
Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | ICLR (Best) | W | Caffe(Author)
Dynamic Network Surgery for Efficient DNNs | NeurIPS | W | Caffe(Author)
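The Deep Compression entry above describes a three-stage pipeline: prune, quantize via weight sharing, then Huffman-code the result. The first two stages can be sketched as follows (a hedged NumPy toy, with a plain Lloyd-iteration k-means standing in for the paper's trained quantization; variable names are illustrative, not from the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=256)  # toy layer weights

# Stage 1: magnitude pruning - zero weights below the median magnitude.
mask = np.abs(w) > np.quantile(np.abs(w), 0.5)
w = w * mask

# Stage 2: quantization via weight sharing - cluster the surviving weights
# into a small codebook (a few Lloyd iterations of plain k-means).
k = 8
nz = w[mask]
centroids = np.linspace(nz.min(), nz.max(), k)
for _ in range(20):
    assign = np.argmin(np.abs(nz[:, None] - centroids[None, :]), axis=1)
    for j in range(k):
        if np.any(assign == j):
            centroids[j] = nz[assign == j].mean()

# Replace each surviving weight by its centroid. The per-weight codebook
# indices are what stage 3 (Huffman coding) would then compress losslessly.
w_q = np.zeros_like(w)
w_q[mask] = centroids[assign]

print(len(np.unique(w_q)))  # at most k + 1 distinct values (codebook + zero)
```

The point of the pipeline is that after stages 1 and 2, the layer is described by a tiny codebook plus highly skewed index/zero statistics, which Huffman coding exploits.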
2015
Title | Venue | Type | Code
---|---|---|---
Learning both Weights and Connections for Efficient Neural Networks | NeurIPS | W | PyTorch(3rd)
Related Repo
Awesome-model-compression-and-acceleration
Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].