
Top 5 network-compression open source projects

Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
group sparsity
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (CVPR 2020)
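A minimal sketch of the kind of group-sparsity (group-lasso) regularizer this line of work builds on, treating each convolutional filter as one group so that whole filters are driven toward zero. The module sizes and the regularization weight are illustrative assumptions, not the paper's settings:

```python
import torch
import torch.nn as nn

def group_lasso_penalty(conv: nn.Conv2d) -> torch.Tensor:
    # Each output filter is one group: shape (out_channels, in_channels * k * k).
    groups = conv.weight.flatten(start_dim=1)
    # Sum of per-filter L2 norms; pushes entire filters toward zero.
    return groups.norm(dim=1).sum()

model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
x = torch.randn(4, 3, 32, 32)
task_loss = model(x).mean()  # stand-in for a real training loss
reg = sum(group_lasso_penalty(m) for m in model.modules() if isinstance(m, nn.Conv2d))
loss = task_loss + 1e-4 * reg  # hypothetical regularization weight
loss.backward()
```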
permute-quantize-finetune
Using ideas from product quantization for state-of-the-art neural network compression.
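A rough sketch of product quantization applied to a single weight matrix: each row is split into sub-vectors, each subspace is clustered with k-means, and only the codebooks plus per-sub-vector assignments are stored. The function names, sizes, and use of scikit-learn's KMeans are assumptions for illustration, not the repository's implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

def pq_compress(W: np.ndarray, n_subvectors: int = 4, n_centroids: int = 16):
    """Split each row of W into sub-vectors, cluster each subspace with
    k-means, and store centroid codebooks plus per-row assignments."""
    rows, cols = W.shape
    assert cols % n_subvectors == 0
    d = cols // n_subvectors
    codebooks, codes = [], []
    for s in range(n_subvectors):
        block = W[:, s * d:(s + 1) * d]                   # (rows, d)
        km = KMeans(n_clusters=n_centroids, n_init=4).fit(block)
        codebooks.append(km.cluster_centers_)             # (n_centroids, d)
        codes.append(km.labels_)                          # (rows,)
    return codebooks, codes

def pq_decompress(codebooks, codes):
    # Rebuild an approximation of W from codebooks and assignments.
    return np.hstack([cb[idx] for cb, idx in zip(codebooks, codes)])

W = np.random.randn(256, 64).astype(np.float32)
codebooks, codes = pq_compress(W)
W_hat = pq_decompress(codebooks, codes)
print("relative reconstruction error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```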
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
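A simplified sketch of an activation-boundary-style distillation term: the student's hidden pre-activations are pushed to the same side of the teacher's activation boundary (the pre-ReLU sign) by at least a margin. The function and margin value are illustrative, not the paper's exact loss:

```python
import torch

def activation_boundary_loss(student_pre: torch.Tensor,
                             teacher_pre: torch.Tensor,
                             margin: float = 1.0) -> torch.Tensor:
    # Penalize student pre-activations that land on the wrong side of the
    # teacher's activation boundary (teacher_pre > 0) by less than `margin`.
    pos = (teacher_pre > 0).float()
    loss_pos = pos * torch.clamp(margin - student_pre, min=0) ** 2
    loss_neg = (1 - pos) * torch.clamp(margin + student_pre, min=0) ** 2
    return (loss_pos + loss_neg).mean()

teacher_pre = torch.randn(8, 128)                       # teacher hidden pre-activations
student_pre = torch.randn(8, 128, requires_grad=True)   # student hidden pre-activations
loss = activation_boundary_loss(student_pre, teacher_pre)
loss.backward()
```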
FisherPruning
Group Fisher Pruning for Practical Network Compression (ICML 2021)
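A rough sketch of a Fisher-style filter-importance score: squared gradient contributions are accumulated over a few batches and the lowest-scoring filters become pruning candidates. The model, random data, and per-filter score below are illustrative assumptions, not the paper's grouped formulation or code:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))
conv = model[0]
criterion = nn.CrossEntropyLoss()
importance = torch.zeros(conv.out_channels)

for _ in range(8):  # a few calibration batches (random data for illustration)
    x = torch.randn(4, 3, 32, 32)
    y = torch.randint(0, 10, (4,))
    model.zero_grad()
    criterion(model(x), y).backward()
    # Empirical-Fisher-style score per output filter: sum of (grad * weight)^2.
    g, w = conv.weight.grad, conv.weight
    importance += ((g * w) ** 2).flatten(1).sum(dim=1).detach()

prune_idx = importance.argsort()[:4]  # e.g. drop the 4 least important filters
print("filters to prune:", prune_idx.tolist())
```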