huawei-noah / Data Efficient Model Compression

Licence: bsd-3-clause

Data-Efficient Model Compression

This repo is the PyTorch implementation of Data-Efficient Model Compression.

Background

Many attempts have been made to extend the success that convolutional neural networks (CNNs) achieve on high-end GPU servers to portable devices such as smartphones. Providing compression and acceleration services for deep learning models on the cloud is therefore significant and attractive for end users. However, existing network compression and acceleration approaches usually fine-tune the svelte model using the entire original training set (e.g. ImageNet), which can be more cumbersome than the network itself and cannot easily be uploaded to the cloud. Data-efficient neural network compression has therefore become a hotspot.

DAFL

ICCV 2019 paper DAFL: Data-Free Learning of Student Networks

DAFL is a compression method that requires no training data at all. More details can be found at DAFL.
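In DAFL, a generator synthesizes training samples for the student by exploiting the teacher: generated images should make the teacher confident (a one-hot loss) and cover all classes evenly across a batch (an information-entropy loss). The sketch below illustrates these two generator losses in plain NumPy on hypothetical teacher logits; it is a simplified illustration, not the repo's implementation, and the feature-activation loss is omitted since it needs the teacher's intermediate features.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def dafl_generator_losses(teacher_logits):
    """Two of DAFL's generator losses for a batch of synthetic samples:
    a one-hot loss (teacher should be confident on generated samples) and
    an information-entropy loss (predictions should be balanced over the
    batch). Simplified sketch; the real method also uses a feature loss."""
    p = softmax(teacher_logits)                      # (batch, classes)
    # One-hot loss: cross-entropy against the teacher's own argmax labels.
    labels = p.argmax(axis=1)
    l_onehot = -np.mean(np.log(p[np.arange(len(p)), labels] + 1e-12))
    # Information-entropy loss: negative entropy of the mean prediction;
    # minimizing it spreads generated samples over all classes.
    p_mean = p.mean(axis=0)
    l_info = np.sum(p_mean * np.log(p_mean + 1e-12))
    return l_onehot, l_info

# Hypothetical teacher logits for 4 generated samples over 3 classes.
logits = np.array([[5.0, 0.1, 0.2],
                   [0.3, 4.0, 0.1],
                   [0.2, 0.1, 6.0],
                   [3.0, 0.5, 0.4]])
l_onehot, l_info = dafl_generator_losses(logits)
```

Minimizing the sum of these terms (plus a distillation loss for the student on the generated batch) drives the generator toward samples that behave like the unavailable training data.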

PU Compression

NeurIPS 2019 paper Positive-Unlabeled Compression on the Cloud.

PU Compression is a compression method that needs only a small amount of training data. More details can be found at pu_compress.
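The positive-unlabeled (PU) setting treats the few uploaded training samples as labeled positives and a large cloud-side pool as unlabeled data. A PU classifier trained this way can select task-relevant samples from the pool. Below is a hedged NumPy sketch of a non-negative PU risk estimator (in the style of Kiryo et al., 2017) with hypothetical classifier scores and an assumed class prior `pi`; it illustrates the idea, not this repo's exact training code.

```python
import numpy as np

def sigmoid_loss(scores, y):
    # Sigmoid surrogate loss l(t, y) = sigmoid(-y * t).
    return 1.0 / (1.0 + np.exp(y * scores))

def nn_pu_risk(scores_p, scores_u, pi):
    """Non-negative PU risk: pi * R_p^+ + max(0, R_u^- - pi * R_p^-).
    `pi` is the (assumed known) prior of positives in the unlabeled pool."""
    r_p_pos = sigmoid_loss(scores_p, +1).mean()   # positives treated as +1
    r_p_neg = sigmoid_loss(scores_p, -1).mean()   # positives treated as -1
    r_u_neg = sigmoid_loss(scores_u, -1).mean()   # unlabeled treated as -1
    # The max(0, .) clamp keeps the estimated negative risk non-negative.
    return pi * r_p_pos + max(0.0, r_u_neg - pi * r_p_neg)

# Hypothetical scores: labeled positives score high, unlabeled pool is mixed.
rng = np.random.default_rng(0)
scores_p = rng.normal(2.0, 1.0, size=100)
scores_u = rng.normal(0.0, 1.0, size=1000)
risk = nn_pu_risk(scores_p, scores_u, pi=0.3)
```

Minimizing this risk over a classifier's parameters yields a selector for positive (task-relevant) samples among the unlabeled data, after which the selected samples can be used for distillation.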

Leaderboard

Method            Used Data          Acc. (MNIST)   Acc. (CIFAR-10)
Teacher           Original Data      98.9           95.6
Student           Original Data      98.9           94.4
Meta-data         Meta Data          92.5           --
PU Compression    PU Data (1/500)    98.9           93.8
DAFL              No Data            98.2           92.2