
126 open-source projects that are alternatives to or similar to Model_compression

Model Optimization
A toolkit for optimizing Keras and TensorFlow ML models for deployment, including quantization and pruning.
Stars: ✭ 992 (+561.33%)
Mutual labels:  quantization, pruning
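As a rough illustration of what this toolkit offers, here is a minimal sketch using the tensorflow_model_optimization package's magnitude-pruning and quantization-aware-training wrappers; treat the exact arguments as illustrative rather than authoritative.

```python
# Illustrative use of the TensorFlow Model Optimization Toolkit (tfmot):
# wrap a Keras model for magnitude pruning and quantization-aware training.
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])

# Prune weights toward 50% sparsity during training.
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
    base_model,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0),
)

# Insert fake-quantization ops for quantization-aware training.
qat_model = tfmot.quantization.keras.quantize_model(base_model)

pruned_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
# The pruning wrapper requires this callback to update masks each step.
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
# pruned_model.fit(x_train, y_train, epochs=2, callbacks=callbacks)
```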
Aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (+202%)
Mutual labels:  quantization, pruning
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+351.33%)
Mutual labels:  quantization, pruning
Ntagger
Reference PyTorch code for named-entity tagging.
Stars: ✭ 58 (-61.33%)
Mutual labels:  quantization, pruning
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+10.67%)
Mutual labels:  quantization, pruning
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+48.67%)
Mutual labels:  quantization, pruning
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT), high-bit (>2b) methods (DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b)/ternary/binary methods (TWN/BNN/XNOR-Net), plus 8-bit post-training quantization (PTQ, TensorRT); (2) pruning: normal, regular, and group-convolution channel pruning; (3) group convolution structure; (4) batch-normalization fusing for quantization. Deployment: TensorRT with fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), and dynamic shapes.
Stars: ✭ 1,232 (+721.33%)
Mutual labels:  quantization, pruning
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-62.67%)
Mutual labels:  pruning, quantization
sparsify
Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (-8%)
Mutual labels:  pruning, quantization
Nncf
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (+45.33%)
Mutual labels:  quantization, pruning
torch-model-compression
An automated toolset for analyzing and modifying PyTorch model structures, including a model compression algorithm library with automatic model-structure analysis.
Stars: ✭ 126 (-16%)
Mutual labels:  pruning, quantization
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Stars: ✭ 666 (+344%)
Mutual labels:  pruning, quantization
sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+76%)
Mutual labels:  pruning, quantization
Awesome Edge Machine Learning
A curated list of awesome edge machine learning resources, including research papers, inference engines, challenges, books, meetups and others.
Stars: ✭ 139 (-7.33%)
Mutual labels:  quantization, pruning
Awesome Emdl
Embedded and mobile deep learning research resources
Stars: ✭ 554 (+269.33%)
Mutual labels:  quantization, pruning
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+2406.67%)
Mutual labels:  quantization, pruning
Kd lib
A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.
Stars: ✭ 173 (+15.33%)
Mutual labels:  quantization, pruning
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-72.67%)
Mutual labels:  pruning, quantization
SSD-Pruning-and-quantization
Pruning and quantization for SSD. Model compression.
Stars: ✭ 19 (-87.33%)
Mutual labels:  pruning, quantization
Libimagequant Rust
libimagequant (pngquant) bindings for the Rust language
Stars: ✭ 17 (-88.67%)
Mutual labels:  quantization
Pytorch Pruning
PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference
Stars: ✭ 740 (+393.33%)
Mutual labels:  pruning
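The repository predates PyTorch's built-in pruning utilities, so as a hedged, generic illustration of the same idea (magnitude-based weight pruning), here is a sketch using torch.nn.utils.prune rather than the repo's own API.

```python
# Generic magnitude pruning with PyTorch's built-in utilities
# (illustrative; not the API of the Pytorch-Pruning repo itself).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
)

for module in model.modules():
    if isinstance(module, nn.Conv2d):
        # Zero out the 30% of weights with the smallest L1 magnitude.
        prune.l1_unstructured(module, name="weight", amount=0.3)

# Make the pruning permanent by folding the masks into the weights.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.remove(module, "weight")
```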
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models, including (1) neural architecture search, (2) lightweight structures, (3) model compression, quantization, and acceleration, (4) hyperparameter optimization, and (5) automated feature engineering.
Stars: ✭ 691 (+360.67%)
Mutual labels:  quantization
Pinto model zoo
A repository that shares tuning results of trained models generated by TensorFlow / Keras. Post-training quantization (Weight Quantization, Integer Quantization, Full Integer Quantization, Float16 Quantization), Quantization-aware training. TensorFlow Lite. OpenVINO. CoreML. TensorFlow.js. TF-TRT. MediaPipe. ONNX. [.tflite,.h5,.pb,saved_model,tfjs,tftrt,mlmodel,.xml/.bin, .onnx]
Stars: ✭ 634 (+322.67%)
Mutual labels:  quantization
Model Quantization
Collections of model quantization algorithms
Stars: ✭ 118 (-21.33%)
Mutual labels:  quantization
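For context, a minimal example of one such algorithm family (post-training dynamic quantization), shown with stock PyTorch rather than this repository's code.

```python
# Post-training dynamic quantization with stock PyTorch
# (illustrative of the general technique, not this repo's algorithms).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()

# Replace Linear layers with int8 dynamically quantized equivalents.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 256)
print(quantized(x).shape)  # same interface, smaller weights
```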
Vectorsinsearch
Dice.com repo to accompany the Dice.com "Vectors in Search" talk by Simon Hughes from the Activate 2018 search conference and the "Searching with Vectors" talk from Haystack 2019 (US). Builds upon my conceptual search and semantic search work from 2015.
Stars: ✭ 71 (-52.67%)
Mutual labels:  quantization
Paddleclas
A treasure chest for image classification powered by PaddlePaddle
Stars: ✭ 625 (+316.67%)
Mutual labels:  quantization
Tf Keras Surgeon
Pruning and other network surgery for trained TF.Keras models.
Stars: ✭ 25 (-83.33%)
Mutual labels:  pruning
Pyepr
Powerful, automated analysis and design of quantum microwave chips & devices [Energy-Participation Ratio and more]
Stars: ✭ 81 (-46%)
Mutual labels:  quantization
Dfq
PyTorch implementation of Data Free Quantization Through Weight Equalization and Bias Correction.
Stars: ✭ 125 (-16.67%)
Mutual labels:  quantization
Deephash
An Open-Source Package for Deep Learning to Hash (DeepHash)
Stars: ✭ 417 (+178%)
Mutual labels:  quantization
Model Compression And Acceleration Progress
Repository to track the progress in model compression and acceleration
Stars: ✭ 63 (-58%)
Mutual labels:  pruning
Brevitas
Brevitas: quantization-aware training in PyTorch
Stars: ✭ 343 (+128.67%)
Mutual labels:  quantization
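A minimal sketch of quantization-aware training with Brevitas, assuming its QuantConv2d/QuantReLU drop-in modules and their bit-width keyword arguments; check the project docs for the current API.

```python
# Quantization-aware layers with Brevitas (sketch; argument names assumed
# from Brevitas's drop-in replacements for torch.nn modules).
import torch
import torch.nn as nn
from brevitas.nn import QuantConv2d, QuantReLU

model = nn.Sequential(
    QuantConv2d(3, 16, kernel_size=3, padding=1, weight_bit_width=4),
    QuantReLU(bit_width=4),
    QuantConv2d(16, 32, kernel_size=3, padding=1, weight_bit_width=4),
)

x = torch.randn(1, 3, 32, 32)
print(model(x).shape)  # fake-quantized forward pass during training
```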
Adventures In Tensorflow Lite
This repository contains notebooks that show the usage of TensorFlow Lite for quantizing deep neural networks.
Stars: ✭ 79 (-47.33%)
Mutual labels:  pruning
Ctranslate2
Fast inference engine for OpenNMT models
Stars: ✭ 140 (-6.67%)
Mutual labels:  quantization
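Illustrative usage of CTranslate2's Python API with int8 weight quantization selected at load time; the model directory and tokens below are placeholders, and the model must first be converted with CTranslate2's converter tools.

```python
# Running a converted OpenNMT/Transformer model with CTranslate2,
# loading weights in int8 (paths and tokens are placeholders).
import ctranslate2

translator = ctranslate2.Translator(
    "ende_ctranslate2/", device="cpu", compute_type="int8"
)

# translate_batch expects pre-tokenized input (a list of token lists).
batch = [["▁Hello", "▁world", "."]]
results = translator.translate_batch(batch)
print(results[0])
```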
Libimagequant
Palette quantization library that powers pngquant and other PNG optimizers
Stars: ✭ 344 (+129.33%)
Mutual labels:  quantization
Dsq
PyTorch implementation of "Differentiable Soft Quantization: Bridging Full-Precision and Low-Bit Neural Networks"
Stars: ✭ 70 (-53.33%)
Mutual labels:  quantization
Pngquant
Lossy PNG compressor — pngquant command based on libimagequant library
Stars: ✭ 4,086 (+2624%)
Mutual labels:  quantization
Cen
[NeurIPS 2020] Code release for paper "Deep Multimodal Fusion by Channel Exchanging" (In PyTorch)
Stars: ✭ 112 (-25.33%)
Mutual labels:  pruning
Grasp
Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" https://openreview.net/pdf?id=SkgsACVKPH
Stars: ✭ 58 (-61.33%)
Mutual labels:  pruning
Inq Pytorch
A PyTorch implementation of "Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights"
Stars: ✭ 147 (-2%)
Mutual labels:  quantization
Tf2
An Open Source Deep Learning Inference Engine Based on FPGA
Stars: ✭ 113 (-24.67%)
Mutual labels:  quantization
Keras Surgeon
Pruning and other network surgery for trained Keras models.
Stars: ✭ 339 (+126%)
Mutual labels:  pruning
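A hedged sketch of the kind of surgery Keras Surgeon performs, assuming its delete_channels operation (name and signature taken from the project's documented examples; version compatibility with your Keras install may vary).

```python
# Removing output channels from a trained Keras model with keras-surgeon
# (sketch; delete_channels signature assumed from the project's examples).
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from kerassurgeon.operations import delete_channels

model = Sequential([
    Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    Flatten(),
    Dense(10),
])

conv = model.layers[0]
# Drop three filters; downstream layers are rewired automatically.
pruned = delete_channels(model, conv, channels=[0, 4, 12])
pruned.summary()
```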
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (+125.33%)
Mutual labels:  pruning
Yolov3 Network Slimming
An implementation of network-slimming pruning for YOLOv3.
Stars: ✭ 320 (+113.33%)
Mutual labels:  pruning
Jacinto Ai Devkit
Training and quantization of embedded-friendly deep learning / machine learning / computer vision models.
Stars: ✭ 49 (-67.33%)
Mutual labels:  quantization
Deephash Papers
Must-read papers on deep learning to hash (DeepHash)
Stars: ✭ 302 (+101.33%)
Mutual labels:  quantization
Yolov3v4 Modelcompression Multidatasettraining Multibackbone
YOLO model compression and multi-dataset training.
Stars: ✭ 287 (+91.33%)
Mutual labels:  pruning
Filter Grafting
Filter Grafting for Deep Neural Networks (CVPR 2020)
Stars: ✭ 110 (-26.67%)
Mutual labels:  pruning
Delve
PyTorch and Keras model training and layer saturation monitor
Stars: ✭ 49 (-67.33%)
Mutual labels:  pruning
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (+94%)
Mutual labels:  pruning
Finn
Dataflow compiler for QNN inference on FPGAs
Stars: ✭ 284 (+89.33%)
Mutual labels:  quantization
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+578%)
Mutual labels:  pruning
Qkeras
QKeras: a quantization deep-learning library for TensorFlow Keras
Stars: ✭ 254 (+69.33%)
Mutual labels:  quantization
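A brief sketch of the drop-in quantized layers QKeras provides, assuming its QDense/QActivation layers and the quantized_bits/quantized_relu quantizers; treat the quantizer arguments as illustrative.

```python
# Drop-in quantized layers with QKeras (sketch; quantizer arguments follow
# the common quantized_bits(bits, integer_bits, ...) pattern).
from tensorflow.keras.models import Sequential
from qkeras import QDense, QActivation, quantized_bits, quantized_relu

model = Sequential([
    QDense(64, input_shape=(784,),
           kernel_quantizer=quantized_bits(4, 0, 1),
           bias_quantizer=quantized_bits(4, 0, 1)),
    QActivation(quantized_relu(4)),
    QDense(10,
           kernel_quantizer=quantized_bits(4, 0, 1),
           bias_quantizer=quantized_bits(4, 0, 1)),
])
model.summary()
```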
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-84.67%)
Mutual labels:  pruning
Graffitist
Graph Transforms to Quantize and Retrain Deep Neural Nets in TensorFlow
Stars: ✭ 135 (-10%)
Mutual labels:  quantization
Hawq
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (-28%)
Mutual labels:  quantization
Quantization.mxnet
Simulate quantization and quantization-aware training for MXNet Gluon models.
Stars: ✭ 42 (-72%)
Mutual labels:  quantization
TextPruner
A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (-37.33%)
Mutual labels:  pruning
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-66.67%)
Mutual labels:  pruning
quantize
🎨 Simple color palette quantization using MMCQ
Stars: ✭ 24 (-84%)
Mutual labels:  quantization
1-60 of 126 similar projects