Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (-83.27%)
Mutual labels: quantization, model-compression, pruning
Aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (-54.33%)
Mutual labels: quantization, pruning, compression
SSD-Pruning-and-quantization
Pruning and quantization for SSD. Model compression.
Stars: ✭ 19 (-98.08%)
Mutual labels: compression, pruning, quantization
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: A Unified Optimization Framework"
Stars: ✭ 41 (-95.87%)
Mutual labels: pruning, quantization, model-compression
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (-77.52%)
Mutual labels: quantization, model-compression, pruning
Kd lib
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of knowledge distillation, pruning, and quantization.
Stars: ✭ 173 (-82.56%)
Mutual labels: quantization, model-compression, pruning
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT) with high-bit (>2b) methods (DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b) ternary/binary methods (TWN/BNN/XNOR-Net), plus post-training quantization (PTQ) to 8-bit (TensorRT); (2) pruning: normal, regular, and group convolutional channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT with fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), and dynamic shapes.
Stars: ✭ 1,232 (+24.19%)
Mutual labels: quantization, model-compression, pruning
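Several entries above (micronet, AIMET, Distiller) center on quantization. As a rough illustration of what symmetric 8-bit post-training quantization does to a weight tensor, here is a minimal pure-Python sketch; it is illustrative only and not taken from any of the listed libraries:

```python
def quantize_int8(values):
    """Symmetric post-training quantization of floats to int8.

    Returns (quantized ints, scale) such that value ~= q * scale.
    """
    # Map the largest absolute value onto 127 (symmetric range).
    scale = max(abs(v) for v in values) / 127 or 1.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the int8 representation."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, s = quantize_int8(weights)
restored = dequantize(q, s)
```

Real libraries add per-channel scales, zero points for asymmetric ranges, and calibration over activation statistics, but the core idea is this round-to-grid mapping.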
Nncf
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (-78.02%)
Mutual labels: quantization, pruning, compression
torch-model-compression
An automated model-structure analysis and modification toolkit for PyTorch models, including a model compression algorithm library with automatic model-structure analysis.
Stars: ✭ 126 (-87.3%)
Mutual labels: pruning, quantization, model-compression
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (-31.75%)
Mutual labels: quantization, model-compression, pruning
libcaesium
The Caesium compression library written in Rust
Stars: ✭ 58 (-94.15%)
Mutual labels: compression, optimization
image-optimizer
Smart image optimization
Stars: ✭ 15 (-98.49%)
Mutual labels: compression, optimization
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-94.96%)
Mutual labels: pruning, model-compression
Imager
Automated image compression for efficiently distributing images on the web.
Stars: ✭ 266 (-73.19%)
Mutual labels: optimization, compression
sparsify
Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (-86.09%)
Mutual labels: pruning, quantization
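Sparsification tools like sparsify typically build on magnitude pruning: zeroing the smallest-magnitude weights. A minimal, library-free sketch of the idea (illustrative only, not sparsify's actual API):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning)."""
    n_prune = int(len(weights) * sparsity)
    # Indices of the n_prune smallest-magnitude entries.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

# 50% sparsity: the two smallest-magnitude weights become zero.
sparse = magnitude_prune([0.1, -2.0, 0.05, 3.0], 0.5)
```

Production tools apply this per layer or globally, often ramping sparsity gradually during fine-tuning rather than in one shot.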
nuxt-prune-html
🔌⚡ Nuxt module that prunes HTML before sending it to the browser by removing elements matching CSS selector(s). Useful for boosting performance by serving a different HTML payload to bots/audits, with all dynamically rendered scripts removed.
Stars: ✭ 69 (-93.04%)
Mutual labels: optimization, pruning
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-95.56%)
Mutual labels: pruning, model-compression
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (-70.67%)
Mutual labels: model-compression, pruning
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (-65.93%)
Mutual labels: model-compression, pruning
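The geometric-median criterion above prunes the filters closest to the geometric median of all filters in a layer, i.e. the most replaceable ones. A rough pure-Python sketch of a related distance-based criterion, keeping the filters with the largest total distance to the others (illustrative only, not the paper's exact algorithm):

```python
import math

def prune_by_distance(filters, keep):
    """Keep the `keep` filters that are least redundant.

    Each filter is a flat list of weights; a filter whose total L2
    distance to all other filters is small sits near the "center"
    and is considered redundant, so it is pruned first.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    scores = [sum(dist(f, g) for g in filters) for f in filters]
    # Rank filters by descending total distance; keep the top `keep`.
    ranked = sorted(range(len(filters)), key=lambda i: -scores[i])
    return sorted(ranked[:keep])

filters = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [10.0, 10.0]]
kept = prune_by_distance(filters, keep=2)
```

In the toy example the outlier filter [10, 10] survives while the filter near the centroid is dropped, which is the intuition behind pruning by redundancy rather than by raw weight magnitude.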
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+279.03%)
Mutual labels: quantization, pruning