da2so / Zero-shot_Knowledge_Distillation_Pytorch

Licence: other
ZSKD with PyTorch

Programming Languages

python

Projects that are alternatives to or similar to Zero-shot Knowledge Distillation Pytorch

ZS-F-VQA
Code and Data for paper: Zero-shot Visual Question Answering using Knowledge Graph [ ISWC 2021 ]
Stars: ✭ 51 (+96.15%)
Mutual labels:  zero-shot
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Stars: ✭ 37 (+42.31%)
Mutual labels:  knowledge-distillation
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+2461.54%)
Mutual labels:  knowledge-distillation
ZAQ-code
CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (+126.92%)
Mutual labels:  zero-shot
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (+203.85%)
Mutual labels:  knowledge-distillation
MLIC-KD-WSD
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
Stars: ✭ 58 (+123.08%)
Mutual labels:  knowledge-distillation
Awesome Knowledge Distillation
Awesome Knowledge Distillation
Stars: ✭ 2,634 (+10030.77%)
Mutual labels:  knowledge-distillation
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (+303.85%)
Mutual labels:  knowledge-distillation
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+1815.38%)
Mutual labels:  knowledge-distillation
Ask2Transformers
A Framework for Textual Entailment based Zero Shot text classification
Stars: ✭ 102 (+292.31%)
Mutual labels:  zero-shot
distill-and-select
Authors' official PyTorch implementation of the "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022]
Stars: ✭ 43 (+65.38%)
Mutual labels:  knowledge-distillation
score-zeroshot
Semantically consistent regularizer for zero-shot learning
Stars: ✭ 65 (+150%)
Mutual labels:  zero-shot
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (+942.31%)
Mutual labels:  knowledge-distillation
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (+3.85%)
Mutual labels:  knowledge-distillation
mixed-language-training
Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems (AAAI-2020)
Stars: ✭ 29 (+11.54%)
Mutual labels:  zero-shot
kdtf
Knowledge Distillation using Tensorflow
Stars: ✭ 139 (+434.62%)
Mutual labels:  knowledge-distillation
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
Stars: ✭ 121 (+365.38%)
Mutual labels:  knowledge-distillation
MoTIS
Mobile (iOS) Text-to-Image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (+130.77%)
Mutual labels:  knowledge-distillation
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (+273.08%)
Mutual labels:  knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (+57.69%)
Mutual labels:  knowledge-distillation

Zero-Shot Knowledge Distillation in Deep Networks (PyTorch)

Star us on GitHub — it helps!!

PyTorch implementation of Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019)

Install

You will need a machine with a GPU and CUDA installed.
Then, prepare the runtime environment:

pip install -r requirements.txt
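
Before running the scripts, you can confirm that PyTorch sees your GPU with a quick check such as the following (plain PyTorch calls; not part of this repository):

import torch

print(torch.__version__)               # installed PyTorch version
print(torch.cuda.is_available())       # should print True on a correctly set up CUDA machine
print(torch.cuda.get_device_name(0))   # name of the first visible GPU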

Use

For the MNIST dataset:

python main.py --dataset=mnist --t_train=False --num_sample=12000 --batch_size=200 

For the CIFAR-10 dataset:

python main.py --dataset=cifar10 --t_train=False --num_sample=24000 --batch_size=100

Arguments:

  • dataset - dataset to use: 'mnist', 'cifar10', or 'cifar100'
  • t_train - whether to train the teacher network
    • if True, train the teacher network
    • if False, load a pre-trained teacher network
  • num_sample - number of Data Impressions (DIs) crafted per category (see the sketch below this list)
  • beta - beta value(s) used to scale the Dirichlet concentration when sampling soft labels
  • batch_size - batch size
  • lr - learning rate
  • iters - number of optimization iterations
  • s_save_path - save path for the student network
  • do_genimgs - whether to generate synthesized images (DIs) with ZSKD
    • if True, generate the images
    • if False, the synthesized images generated by ZSKD must already exist
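
To make num_sample and beta concrete, the following is a minimal, illustrative sketch of how a ZSKD-style Data Impression batch could be crafted for one class: soft labels are sampled from a Dirichlet distribution whose concentration is the teacher's class-similarity vector scaled by beta, and random input images are then optimized until the teacher reproduces those soft labels. All names here (craft_data_impressions, class_sim, and so on) are hypothetical and do not mirror this repository's actual code.

import torch
import torch.nn.functional as F

def craft_data_impressions(teacher, class_sim, target_class, beta=1.0,
                           num_di=100, img_shape=(1, 28, 28),
                           iters=1500, lr=0.01, temperature=20.0):
    # class_sim: (num_classes, num_classes) matrix of positive class similarities,
    # typically derived from the teacher's final-layer weights.
    alpha = beta * class_sim[target_class]                           # Dirichlet concentration for this class
    y_soft = torch.distributions.Dirichlet(alpha).sample((num_di,))  # sampled soft labels

    x = torch.randn(num_di, *img_shape, requires_grad=True)          # start from random noise images
    optimizer = torch.optim.Adam([x], lr=lr)

    teacher.eval()
    for _ in range(iters):  # in practice this is done in smaller batches
        optimizer.zero_grad()
        logits = teacher(x)
        # Push the teacher's temperature-softened prediction toward the sampled soft labels.
        loss = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                        y_soft, reduction='batchmean')
        loss.backward()
        optimizer.step()
    return x.detach(), y_soft

In the repository, --num_sample controls how many such DIs are crafted per category and --iters sets the number of optimization iterations.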

Result examples for MNIST dataset

Understanding this method (algorithm)

Check my blog!! Here
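
In short, ZSKD first crafts Data Impressions from the teacher alone and then trains the student on them with an ordinary temperature-scaled distillation objective. The snippet below is a rough sketch of that second step using standard PyTorch calls; the function name distill_step is hypothetical and this is not the repository's actual training loop.

import torch
import torch.nn.functional as F

def distill_step(student, teacher, x_synth, optimizer, temperature=20.0):
    # One student update on a batch of synthesized images (Data Impressions).
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x_synth)
    s_logits = student(x_synth)
    # Temperature-scaled KL divergence between student and teacher distributions,
    # rescaled by T^2 as is conventional in knowledge distillation.
    loss = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                    F.softmax(t_logits / temperature, dim=1),
                    reduction='batchmean') * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()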
