
201 Open source projects that are alternatives of or similar to ATMC

Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (+321.95%)
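The temperature-scaled distillation loss that KD libraries like this one benchmark can be sketched in pure Python. This is illustrative only; the function names are mine, not Kd lib's API:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2, following Hinton et al.'s formulation, so gradient
    magnitudes stay comparable as T changes.
    """
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

When the student's logits match the teacher's, the loss is zero; it grows as the softened distributions diverge.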
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+304.88%)
torch-model-compression
A toolkit for automated structural analysis and modification of PyTorch models, including a library of model-compression algorithms that automatically analyze model structure.
Stars: ✭ 126 (+207.32%)
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+1551.22%)
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+443.9%)
Micronet
micronet, a model compression and deployment lib. Compression: 1) quantization: quantization-aware training (QAT), high-bit (>2b: DoReFa, "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b)/ternary and binary (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); 2) pruning: normal, regular, and group convolutional channel pruning; 3) group convolution structure; 4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op-adapt (upsample), dynamic_shape.
Stars: ✭ 1,232 (+2904.88%)
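The 8-bit post-training quantization these libraries implement reduces to an affine map between floats and integers, x ≈ scale * (q - zero_point). A minimal pure-Python sketch (helper names are my own, not any listed project's API):

```python
def quantize(xs, num_bits=8):
    """Affine (asymmetric) quantization of a list of floats to ints.

    Returns (q, scale, zero_point) such that x ≈ scale * (q - zero_point).
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(xs), max(xs)
    lo, hi = min(lo, 0.0), max(hi, 0.0)   # keep 0.0 exactly representable
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(x / scale) + zero_point)) for x in xs]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map quantized ints back to approximate float values."""
    return [scale * (qi - zero_point) for qi in q]
```

Round-tripping a tensor through quantize/dequantize bounds the per-element error by roughly half the scale, which is why calibration (choosing lo/hi from representative data) matters so much in PTQ.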
Model Optimization
A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+2319.51%)
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (+397.56%)
Mutual labels:  pruning, model-compression
Nncf
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (+431.71%)
Mutual labels:  pruning, quantization
sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+543.9%)
Mutual labels:  pruning, quantization
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies (low-precision quantization, sparsity, pruning, knowledge distillation) across different deep learning frameworks, pursuing optimal inference performance.
Stars: ✭ 666 (+1524.39%)
Mutual labels:  pruning, quantization
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+1585.37%)
Mutual labels:  quantization, model-compression
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (+7.32%)
Mutual labels:  pruning, model-compression
Aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (+1004.88%)
Mutual labels:  pruning, quantization
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (+724.39%)
Mutual labels:  pruning, model-compression
ZAQ-code
CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (+43.9%)
Mutual labels:  quantization, model-compression
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (-12.2%)
Mutual labels:  quantization, model-compression
SSD-Pruning-and-quantization
Pruning and quantization for SSD. Model compression.
Stars: ✭ 19 (-53.66%)
Mutual labels:  pruning, quantization
Model compression
PyTorch Model Compression
Stars: ✭ 150 (+265.85%)
Mutual labels:  pruning, quantization
Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+4858.54%)
Mutual labels:  quantization, model-compression
Hawq
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (+163.41%)
Mutual labels:  quantization, model-compression
Tf2
An Open Source Deep Learning Inference Engine Based on FPGA
Stars: ✭ 113 (+175.61%)
Mutual labels:  quantization, model-compression
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (+21.95%)
Mutual labels:  pruning, model-compression
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (+36.59%)
Mutual labels:  pruning, quantization
sparsify
Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (+236.59%)
Mutual labels:  pruning, quantization
Awesome Edge Machine Learning
A curated list of awesome edge machine learning resources, including research papers, inference engines, challenges, books, meetups and others.
Stars: ✭ 139 (+239.02%)
Mutual labels:  pruning, quantization
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning that maintains layer dependencies.
Stars: ✭ 193 (+370.73%)
Mutual labels:  pruning, model-compression
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+9070.73%)
Mutual labels:  pruning, quantization
Awesome Emdl
Embedded and mobile deep learning research resources
Stars: ✭ 554 (+1251.22%)
Mutual labels:  pruning, quantization
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (+609.76%)
Mutual labels:  pruning, model-compression
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+2380.49%)
Mutual labels:  pruning, model-compression
Ntagger
reference pytorch code for named entity tagging
Stars: ✭ 58 (+41.46%)
Mutual labels:  pruning, quantization
fastT5
⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x.
Stars: ✭ 421 (+926.83%)
Mutual labels:  quantization
Neural-Network-Compression
Paper list for neural network compression techniques
Stars: ✭ 31 (-24.39%)
Mutual labels:  quantization
ViTs-vs-CNNs
[NeurIPS 2021]: Are Transformers More Robust Than CNNs? (Pytorch implementation & checkpoints)
Stars: ✭ 145 (+253.66%)
Mutual labels:  robustness
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (+65.85%)
Mutual labels:  robustness
CIL-ReID
Benchmarks for Corruption Invariant Person Re-identification. [NeurIPS 2021 Track on Datasets and Benchmarks]
Stars: ✭ 71 (+73.17%)
Mutual labels:  robustness
aileen-core
Sensor data aggregation tool for any numerical sensor data. Robust and privacy-friendly.
Stars: ✭ 15 (-63.41%)
Mutual labels:  robustness
pre-training
Pre-Training Buys Better Robustness and Uncertainty Estimates (ICML 2019)
Stars: ✭ 90 (+119.51%)
Mutual labels:  robustness
robust-gcn
Implementation of the paper "Certifiable Robustness and Robust Training for Graph Convolutional Networks".
Stars: ✭ 35 (-14.63%)
Mutual labels:  robustness
PyTorch-Deep-Compression
A PyTorch implementation of the iterative pruning method described in Han et al. (2015)
Stars: ✭ 39 (-4.88%)
Mutual labels:  pruning
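The magnitude-based pruning step that Han et al.'s iterative method repeats can be sketched as: zero out the fraction of weights with the smallest absolute values, then retrain with the mask held fixed. A pure-Python sketch (helper names are mine, not this repo's API):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest |w|.

    Returns (pruned_weights, mask); masked positions stay zero during
    the retraining passes of iterative pruning.
    """
    n_prune = int(len(weights) * sparsity)
    # Rank indices by weight magnitude; the smallest n_prune get dropped.
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    drop = set(ranked[:n_prune])
    mask = [0 if i in drop else 1 for i in range(len(weights))]
    pruned = [w * m for w, m in zip(weights, mask)]
    return pruned, mask
```

Iterative schemes alternate this step with fine-tuning, gradually raising `sparsity` rather than pruning to the target in one shot.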
Structured-Bayesian-Pruning-pytorch
pytorch implementation of Structured Bayesian Pruning
Stars: ✭ 18 (-56.1%)
Mutual labels:  model-compression
fasterai1
FasterAI: A repository for making smaller and faster models with the FastAI library.
Stars: ✭ 34 (-17.07%)
Mutual labels:  pruning
Auto-Compression
Automatic DNN compression tool with various model compression and neural architecture search techniques
Stars: ✭ 19 (-53.66%)
Mutual labels:  model-compression
recentrifuge
Recentrifuge: robust comparative analysis and contamination removal for metagenomics
Stars: ✭ 79 (+92.68%)
Mutual labels:  robustness
MQBench Quantize
QAT(quantize aware training) for classification with MQBench
Stars: ✭ 29 (-29.27%)
Mutual labels:  quantization
TF2DeepFloorplan
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Enables TensorBoard, quantization, Flask, TFLite, Docker, GitHub Actions, and Google Colab.
Stars: ✭ 98 (+139.02%)
Mutual labels:  quantization
deepvac
PyTorch Project Specification.
Stars: ✭ 507 (+1136.59%)
Mutual labels:  quantization
Selecsls Pytorch
Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
Stars: ✭ 251 (+512.2%)
Mutual labels:  pruning
adversarial-robustness-public
Code for AAAI 2018 accepted paper: "Improving the Adversarial Robustness and Interpretability of Deep Neural Networks by Regularizing their Input Gradients"
Stars: ✭ 49 (+19.51%)
Mutual labels:  robustness
Skimcaffe
Caffe for Sparse Convolutional Neural Network
Stars: ✭ 230 (+460.98%)
Mutual labels:  pruning
Stochastic-Quantization
Training Low-bits DNNs with Stochastic Quantization
Stars: ✭ 70 (+70.73%)
Mutual labels:  quantization
NeuralNetworkAnalysis.jl
Reachability analysis for closed-loop control systems
Stars: ✭ 37 (-9.76%)
Mutual labels:  robustness
local-search-quantization
State-of-the-art method for large-scale ANN search as of Oct 2016. Presented at ECCV 16.
Stars: ✭ 70 (+70.73%)
Mutual labels:  quantization
torchprune
A research library for pytorch-based neural network pruning, compression, and more.
Stars: ✭ 133 (+224.39%)
Mutual labels:  pruning
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+385.37%)
Mutual labels:  pruning
GAN-LTH
[ICLR 2021] "GANs Can Play Lottery Too" by Xuxi Chen, Zhenyu Zhang, Yongduo Sui, Tianlong Chen
Stars: ✭ 24 (-41.46%)
Mutual labels:  pruning
Mobile Yolov5 Pruning Distillation
Pruning and distillation for mobilev2-yolov5s; supports ncnn and TensorRT deployment. Ultra-light but with better performance!
Stars: ✭ 192 (+368.29%)
Mutual labels:  pruning
pytorch-network-slimming
A package to make Network Slimming a little easier
Stars: ✭ 40 (-2.44%)
Mutual labels:  pruning
aliyun-mns
Alibaba Cloud MNS
Stars: ✭ 13 (-68.29%)
Mutual labels:  robustness
1-60 of 201 similar projects