
FLHonker / ZAQ-code

License: MIT
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to ZAQ-code

Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+277.97%)
Mutual labels:  quantization, model-compression
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+1047.46%)
Mutual labels:  quantization, model-compression
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-5.08%)
Mutual labels:  quantization, distillation
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (-38.98%)
Mutual labels:  quantization, model-compression
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (+193.22%)
Mutual labels:  quantization, model-compression
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-30.51%)
Mutual labels:  quantization, model-compression
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+6272.88%)
Mutual labels:  quantization, distillation
torch-model-compression
An automated toolset for analyzing and modifying the structure of PyTorch models, including a model-compression algorithm library that analyzes model structure automatically.
Stars: ✭ 126 (+113.56%)
Mutual labels:  quantization, model-compression
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT) at high bit-widths (>2b: DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low bit-widths (≤2b: ternary and binary, TWN/BNN/XNOR-Net), plus 8-bit post-training quantization (PTQ, TensorRT); (2) pruning: normal, regular, and group-convolution channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), dynamic shapes.
Stars: ✭ 1,232 (+1988.14%)
Mutual labels:  quantization, model-compression
Model Optimization
A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+1581.36%)
Mutual labels:  quantization, model-compression
Yolov5-distillation-train-inference
Yolov5 distillation training | Yolov5 knowledge-distillation training, with support for training on your own data.
Stars: ✭ 84 (+42.37%)
Mutual labels:  model-compression, distillation
Tf2
An Open Source Deep Learning Inference Engine Based on FPGA
Stars: ✭ 113 (+91.53%)
Mutual labels:  quantization, model-compression
Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+3345.76%)
Mutual labels:  quantization, model-compression
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+1071.19%)
Mutual labels:  quantization, model-compression
Hawq
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (+83.05%)
Mutual labels:  quantization, model-compression
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+181.36%)
Mutual labels:  quantization, model-compression
Model compression
PyTorch Model Compression
Stars: ✭ 150 (+154.24%)
Mutual labels:  quantization
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+4591.53%)
Mutual labels:  quantization
Zeroq
[CVPR'20] ZeroQ: A Novel Zero Shot Quantization Framework
Stars: ✭ 150 (+154.24%)
Mutual labels:  quantization
TF2DeepFloorplan
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Enable tensorboard, quantization, flask, tflite, docker, github actions and google colab.
Stars: ✭ 98 (+66.1%)
Mutual labels:  quantization

Zero-shot Adversarial Quantization (ZAQ)

[paper], accepted as an oral presentation at CVPR 2021.

Authors: Yuang Liu, Wei Zhang, Jun Wang

East China Normal University (ECNU)

Intro

Figure 1. Overview: (a) previous methods; (b) ours.

To address quantization when no training data are available, we propose a zero-shot adversarial quantization (ZAQ) framework that enables effective discrepancy estimation and knowledge transfer from a full-precision model to its quantized counterpart. This is achieved by a novel two-level discrepancy modeling that drives a generator to synthesize informative and diverse examples, on which the quantized model is optimized in an adversarial learning fashion.
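
To make the two-level idea concrete, here is a minimal PyTorch sketch of an output-level plus feature-level discrepancy between the full-precision model and its quantized counterpart. The function names and the specific losses (L1 on logits, MSE on normalized attention maps) are illustrative assumptions, not the paper's exact formulation:

import torch
import torch.nn.functional as F

def attention_map(feat):
    # Channel-pooled spatial attention, L2-normalized per sample.
    a = feat.pow(2).mean(dim=1).flatten(1)  # (N, H*W)
    return F.normalize(a, dim=1)

def two_level_discrepancy(p_logits, q_logits, p_feats, q_feats):
    # Level 1: output discrepancy between full-precision and quantized model.
    out_d = F.l1_loss(q_logits, p_logits)
    # Level 2: intermediate (feature) discrepancy over matched layer pairs.
    feat_d = sum(F.mse_loss(attention_map(qf), attention_map(pf))
                 for pf, qf in zip(p_feats, q_feats))
    return out_d + feat_d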

Figure 2. The ZAQ framework.

Requirements

  • python>=3.6
  • torch>=1.2
  • torchvision
  • visdom
  • numpy
  • pillow
  • scikit-learn
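
These can typically be installed with pip (standard PyPI package names assumed):

pip install "torch>=1.2" torchvision visdom numpy pillow scikit-learn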

Usage

To obtain a full-precision model, please refer to train.py.

QAT on the original dataset:

python quantize.py --model resnet18 --ckpt 'path/' --data_root './data/' --weight_bit 6 --activation_bit 8
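
The --weight_bit and --activation_bit flags set the bit-widths. As a rough picture of what 6-bit weights and 8-bit activations mean, here is a generic symmetric uniform fake-quantization sketch; this is an assumption for illustration, and the repo's actual quantizer may differ:

import torch

def fake_quantize(x, num_bits):
    # Symmetric uniform quantize-dequantize ("fake" quantization).
    qmax = 2 ** (num_bits - 1) - 1
    scale = x.detach().abs().max().clamp(min=1e-8) / qmax
    xq = torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale
    # Straight-through estimator: forward pass uses xq, backward sees identity.
    return x + (xq - x).detach()

w = fake_quantize(torch.randn(64, 3, 3, 3), num_bits=6)                # --weight_bit 6
a = fake_quantize(torch.relu(torch.randn(8, 64, 32, 32)), num_bits=8)  # --activation_bit 8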

Zero-shot quantization without data:

python main.py --model resnet18 --ckpt 'path/' --data_root './data/' --weight_bit 6 --activation_bit 8 
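
For orientation, here is a heavily simplified sketch of one adversarial round in a data-free run: the generator is updated to maximize the full-precision/quantized discrepancy, then the quantized model is updated to minimize it on fresh synthetic data. All names (generator, fp_model, q_model, the optimizers) are illustrative assumptions, only the output-level term is used for brevity, and main.py is the authoritative implementation:

import torch
import torch.nn.functional as F

def adversarial_round(generator, fp_model, q_model, g_opt, q_opt,
                      batch_size=64, latent_dim=100):
    # 1) Generator step: synthesize a batch and maximize the discrepancy,
    #    pushing the generator toward examples the quantized model gets wrong.
    z = torch.randn(batch_size, latent_dim)
    fake = generator(z)
    g_loss = -F.l1_loss(q_model(fake), fp_model(fake).detach())
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    # 2) Quantized-model step: minimize the same discrepancy on a new batch,
    #    transferring the full-precision model's knowledge.
    fake = generator(torch.randn(batch_size, latent_dim)).detach()
    q_loss = F.l1_loss(q_model(fake), fp_model(fake).detach())
    q_opt.zero_grad(); q_loss.backward(); q_opt.step()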

Todo

  • Segmentation networks
  • Object detection networks
  • Quantization supported by PyTorch >= 1.7
  • Mixed-/arbitrary-precision quantization

Note: this code is currently provided for reference; an improved version will be uploaded in the future.

Citation

@InProceedings{yuang2021zaq,
    title = {Zero-shot Adversarial Quantization},
    author = {Liu, Yuang and Zhang, Wei and Wang, Jun},
    booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {June},
    year = {2021}
}