
huawei-noah / Pruning

License: BSD-3-Clause
Code for "Co-Evolutionary Compression for Unpaired Image Translation" (ICCV 2019) and "SCOP: Scientific Control for Reliable Neural Network Pruning" (NeurIPS 2020).

Programming Languages

Python

Projects that are alternatives to or similar to Pruning

Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+539.62%)
Mutual labels:  model-compression
Hawq
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (-32.08%)
Mutual labels:  model-compression
Collaborative Distillation
PyTorch code for our CVPR'20 paper "Collaborative Distillation for Ultra-Resolution Universal Style Transfer"
Stars: ✭ 138 (-13.21%)
Mutual labels:  model-compression
Keras model compression
Model compression in Keras based on Geoffrey Hinton's logit-regression (knowledge distillation) method, applied to MNIST: 16x compression at over 95% accuracy. An implementation of "Distilling the Knowledge in a Neural Network" (Geoffrey Hinton et al.)
Stars: ✭ 59 (-62.89%)
Mutual labels:  model-compression
Nni
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+6628.3%)
Mutual labels:  model-compression
Awesome Model Compression
papers about model compression
Stars: ✭ 119 (-25.16%)
Mutual labels:  model-compression
Model Optimization
A toolkit for optimizing ML models for deployment with Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+523.9%)
Mutual labels:  model-compression
Amc Models
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 154 (-3.14%)
Mutual labels:  model-compression
Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+996.86%)
Mutual labels:  model-compression
Condensa
Programmable Neural Network Compression
Stars: ✭ 129 (-18.87%)
Mutual labels:  model-compression
Aquvitae
The Easiest Knowledge Distillation Library for Lightweight Deep Learning
Stars: ✭ 71 (-55.35%)
Mutual labels:  model-compression
Neuronblocks
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Stars: ✭ 1,356 (+752.83%)
Mutual labels:  model-compression
Microexpnet
MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images
Stars: ✭ 121 (-23.9%)
Mutual labels:  model-compression
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
Stars: ✭ 1,031 (+548.43%)
Mutual labels:  model-compression
Yolov3
YOLOv3 in PyTorch
Stars: ✭ 142 (-10.69%)
Mutual labels:  model-compression
Compress
Compressing Representations for Self-Supervised Learning
Stars: ✭ 43 (-72.96%)
Mutual labels:  model-compression
Tf2
An Open Source Deep Learning Inference Engine Based on FPGA
Stars: ✭ 113 (-28.93%)
Mutual labels:  model-compression
Pytorch Weights pruning
PyTorch Implementation of Weights Pruning
Stars: ✭ 158 (-0.63%)
Mutual labels:  model-compression
Ld Net
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
Stars: ✭ 148 (-6.92%)
Mutual labels:  model-compression
Pretrained Language Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+1178.62%)
Mutual labels:  model-compression

GAN-pruning

A PyTorch implementation of our ICCV 2019 paper, Co-Evolutionary Compression for Unpaired Image Translation, which proposes a co-evolutionary approach that simultaneously reduces the memory usage and FLOPs of generators on image-to-image translation tasks while maintaining their performance.
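Roughly, the method represents each generator's channels as a binary mask and evolves two mask populations, one per generator in the unpaired-translation pair, under a fitness that trades compression against output quality. Below is a minimal, illustrative sketch of such a co-evolutionary search loop, not the authors' code: the `fidelity()` function is a placeholder for the paper's GAN and cycle-consistency evaluation, and the population size, mutation rate, and `LAMBDA` weight are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS, POP_SIZE, GENERATIONS = 64, 20, 50
LAMBDA = 0.5  # weight on the compression term (assumed, not from the paper)

def fidelity(mask_a, mask_b):
    # Placeholder: the real method evaluates the two pruned generators
    # (A->B and B->A) with GAN and cycle-consistency losses.
    return -abs(mask_a.mean() - 0.5) - abs(mask_b.mean() - 0.5)

def fitness(mask_a, mask_b):
    # Fewer kept channels means a higher compression reward.
    compression = 2.0 - mask_a.mean() - mask_b.mean()
    return fidelity(mask_a, mask_b) + LAMBDA * compression

pop_a = rng.integers(0, 2, size=(POP_SIZE, N_CHANNELS))
pop_b = rng.integers(0, 2, size=(POP_SIZE, N_CHANNELS))

def next_gen(pop, scores):
    # Tournament selection followed by bit-flip mutation.
    new = []
    for _ in range(POP_SIZE):
        i, j = rng.integers(0, POP_SIZE, size=2)
        parent = pop[i] if scores[i] >= scores[j] else pop[j]
        child = parent.copy()
        flips = rng.random(N_CHANNELS) < 0.05
        child[flips] ^= 1
        new.append(child)
    return np.stack(new)

for generation in range(GENERATIONS):
    # Co-evolution: score population A against the current best of B,
    # then score population B against the resulting best of A.
    best_b = pop_b[np.argmax([fitness(pop_a[0], m) for m in pop_b])]
    scores_a = np.array([fitness(m, best_b) for m in pop_a])
    best_a = pop_a[np.argmax(scores_a)]
    scores_b = np.array([fitness(best_a, m) for m in pop_b])
    pop_a, pop_b = next_gen(pop_a, scores_a), next_gen(pop_b, scores_b)

print("kept channels in generator A:", int(best_a.sum()), "/", N_CHANNELS)
```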

Performance

Performance on Cityscapes compared with a conventional pruning method:

SCOP

A PyTorch implementation of our NeurIPS 2020 paper, SCOP: Scientific Control for Reliable Neural Network Pruning, which proposes a reliable neural network pruning algorithm based on setting up a scientific control.
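At a high level, SCOP feeds the network both real inputs and "knockoff" inputs that act as a scientific control, and treats a channel as redundant when its response to the knockoffs rivals its response to real data. The snippet below is a minimal sketch of that idea, not the repository's implementation: the random knockoff tensor and the 0.5 prune ratio are placeholders, whereas the paper constructs knockoffs that mimic the real data distribution and compares the two via learned per-channel scaling factors.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)  # a layer to be pruned

real = torch.randn(8, 3, 32, 32)      # batch of real inputs
knockoff = torch.randn(8, 3, 32, 32)  # placeholder control inputs; the paper
                                      # generates knockoffs that mimic the
                                      # statistics of the real data

with torch.no_grad():
    # Per-channel mean absolute response to real and control inputs.
    feat_real = conv(real).abs().mean(dim=(0, 2, 3))
    feat_ko = conv(knockoff).abs().mean(dim=(0, 2, 3))

# A channel whose real-data response barely exceeds the control response
# carries little input-specific information and is a pruning candidate.
score = feat_real - feat_ko
prune_ratio = 0.5  # assumed ratio for illustration
k = int(conv.out_channels * prune_ratio)
to_prune = torch.argsort(score)[:k]
print("channels to prune:", to_prune.tolist())
```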

Performance

Comparison of networks pruned by different methods on ImageNet:
