Top 29 knowledge-distillation open source projects

Pretrained Language Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Knowledge distillation via TF2.0
Code for recent knowledge distillation algorithms, with benchmark results, implemented with the TF2.0 low-level API.
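Most of the projects in this list build on the classic soft-target objective from Hinton et al.'s "Distilling the Knowledge in a Neural Network". As a reference point, here is a minimal sketch of that loss written against the TF2 low-level API; the function name, temperature, and mixing weight are illustrative assumptions, not code from this repository.

```python
import tensorflow as tf

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: softened teacher targets + hard-label CE."""
    # Cross-entropy against the temperature-softened teacher distribution
    # (equivalent to KL divergence up to a constant in the student).
    soft_teacher = tf.nn.softmax(teacher_logits / T, axis=-1)
    log_soft_student = tf.nn.log_softmax(student_logits / T, axis=-1)
    soft_loss = -tf.reduce_mean(
        tf.reduce_sum(soft_teacher * log_soft_student, axis=-1))
    # T^2 keeps the soft-loss gradient magnitude comparable across temperatures.
    soft_loss *= T ** 2
    # Ordinary cross-entropy on the ground-truth labels.
    hard_loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=student_logits))
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```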
ACCV TinyGAN
BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression
FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
SemCKD
Official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
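For context, the retrieval side of such a system can be reproduced in a few lines with OpenAI's clip package; MoTIS itself swaps in distilled, mobile-sized encoders. The file names and query below are placeholders.

```python
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Embed the image gallery once, offline.
paths = ["photo1.jpg", "photo2.jpg"]  # placeholder file names
images = torch.stack([preprocess(Image.open(p)) for p in paths]).to(device)
with torch.no_grad():
    img_feats = model.encode_image(images)
    txt_feats = model.encode_text(clip.tokenize(["a dog on a beach"]).to(device))

# Rank gallery images by cosine similarity to the text query.
img_feats = img_feats / img_feats.norm(dim=-1, keepdim=True)
txt_feats = txt_feats / txt_feats.norm(dim=-1, keepdim=True)
ranking = (txt_feats @ img_feats.T).squeeze(0).argsort(descending=True)
print([paths[i] for i in ranking])
```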
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
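The core idea here is to transfer which hidden neurons fire rather than their exact values. A hedged sketch of a margin-based activation-boundary loss follows; the shapes and margin are assumptions, not the paper's exact code.

```python
import torch

def activation_boundary_loss(student_pre, teacher_pre, margin=1.0):
    # student_pre / teacher_pre: pre-ReLU activations of matching shape.
    # Push the student to the same side of the activation boundary as the
    # teacher, with a margin, instead of regressing the raw values.
    active = (teacher_pre > 0).float()
    wrong_when_active = torch.clamp(margin - student_pre, min=0.0) ** 2
    wrong_when_inactive = torch.clamp(margin + student_pre, min=0.0) ** 2
    return (active * wrong_when_active +
            (1.0 - active) * wrong_when_inactive).mean()
```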
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) provides unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
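Localization distillation treats each box edge as a discrete probability distribution (as in GFocal-style heads) and distills it with a temperature-scaled KL divergence. A minimal sketch under those assumptions, not this repository's actual code:

```python
import torch.nn.functional as F

def ld_loss(student_edge_logits, teacher_edge_logits, T=10.0):
    # Logits of shape (num_boxes, 4, n_bins): each of the 4 box edges is
    # predicted as a distribution over n_bins discretized offsets.
    p_teacher = F.softmax(teacher_edge_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_edge_logits / T, dim=-1)
    # Temperature-scaled KL between teacher and student edge distributions.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T ** 2
```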
MLIC-KD-WSD
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Object-Detection-Knowledge-Distillation
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
distill-and-select
Authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" (IJCV 2022).
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
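The split-computing recipe behind both papers is to train a compact head to reproduce the teacher's features at the split point, so the unchanged tail can run server-side. A runnable toy sketch; the tiny conv stacks and hyperparameters are stand-ins, not the papers' models.

```python
import torch
import torch.nn as nn

# Frozen early layers of the pretrained teacher (stand-in architecture).
teacher_head = nn.Sequential(nn.Conv2d(3, 64, 3, 2, 1), nn.ReLU(),
                             nn.Conv2d(64, 128, 3, 2, 1), nn.ReLU())
# Smaller head intended for the edge device; must match the teacher's
# output shape at the split point so the tail can consume it unchanged.
student_head = nn.Sequential(nn.Conv2d(3, 32, 3, 2, 1), nn.ReLU(),
                             nn.Conv2d(32, 128, 3, 2, 1), nn.ReLU())
for p in teacher_head.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(student_head.parameters(), lr=1e-3)
x = torch.randn(8, 3, 224, 224)  # stand-in for a training batch
opt.zero_grad()
# Regress the student's feature map onto the teacher's at the split point.
loss = nn.functional.mse_loss(student_head(x), teacher_head(x))
loss.backward()
opt.step()
```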
kdtf
Knowledge Distillation using TensorFlow.