Regularization-Pruning: [ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-94.05%)
Mutual labels: pruning
TextPruner: A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (-87.3%)
Mutual labels: pruning
Keras Surgeon: Pruning and other network surgery for trained Keras models.
Stars: ✭ 339 (-54.19%)
Mutual labels: pruning
jp-ocr-prunned-cnn: Attempting feature-map pruning on a CNN trained for Japanese OCR
Stars: ✭ 15 (-97.97%)
Mutual labels: pruning
SViTE: [NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-93.24%)
Mutual labels: pruning
Soft Filter Pruning: Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (-60.68%)
Mutual labels: pruning
batchnorm-pruning: Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124
Stars: ✭ 66 (-91.08%)
Mutual labels: pruning
Awesome Emdl: Embedded and mobile deep learning research resources
Stars: ✭ 554 (-25.14%)
Mutual labels: pruning
sparsify: Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (-81.35%)
Mutual labels: pruning
Filter Pruning Geometric Median: Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (-54.32%)
Mutual labels: pruning
nuxt-prune-html: 🔌⚡ Nuxt module to prune HTML before sending it to the browser (removes elements matching CSS selectors); useful for boosting performance by serving bots/audits a lighter HTML with dynamic-rendering scripts removed
Stars: ✭ 69 (-90.68%)
Mutual labels: pruning
RMNet: The RM operation can equivalently convert a ResNet into a VGG-style network, which is better suited for pruning, and can help RepVGG perform better at large depths.
Stars: ✭ 129 (-82.57%)
Mutual labels: pruning
Deep-Compression.Pytorch: Unofficial PyTorch implementation of Deep Compression on CIFAR-10
Stars: ✭ 29 (-96.08%)
Mutual labels: pruning
Distiller: Neural Network Distiller by Intel AI Lab, a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+408.11%)
Mutual labels: pruning
mmrazor: OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (-12.97%)
Mutual labels: pruning
SIGIR2021 Conure: "One Person, One Model, One World: Learning Continual User Representation without Forgetting"
Stars: ✭ 23 (-96.89%)
Mutual labels: pruning
PaddleSlim: An open-source library for deep model compression and architecture search.
Stars: ✭ 677 (-8.51%)
Mutual labels: pruning
AIMET: A library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (-38.78%)
Mutual labels: pruning
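Most of the repositories above build on the same core idea: removing low-importance weights or filters from a trained network. A minimal NumPy sketch of unstructured magnitude pruning (the function name and threshold rule are illustrative only, not taken from any of the listed toolkits):

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights so ~`sparsity` of them become zero."""
    k = int(sparsity * weights.size)  # number of entries to prune
    if k == 0:
        return weights.copy()
    # the k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep only weights above the threshold
    return weights * mask

# Example: prune half of a 2x2 weight matrix
w = np.array([[0.1, -2.0], [0.5, 3.0]])
pruned = magnitude_prune(w, 0.5)  # the two smallest-magnitude entries are zeroed
```

PyTorch ships a comparable utility (`torch.nn.utils.prune.l1_unstructured`), and the toolkits listed here (Distiller, PaddleSlim, AIMET, sparsify) wrap richer criteria such as filter norms or geometric-median distance on top of this basic mask-and-multiply pattern.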