
dkozlov / Awesome Knowledge Distillation

License: Apache-2.0

Projects that are alternatives to or similar to Awesome Knowledge Distillation

AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (-96.01%)
Mutual labels:  knowledge-distillation, knowledge-transfer
ACCV TinyGAN
BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression
Stars: ✭ 62 (-97.65%)
Mutual labels:  knowledge-distillation
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression techniques, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (-74.72%)
Mutual labels:  knowledge-distillation
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
Stars: ✭ 49 (-98.14%)
Mutual labels:  knowledge-distillation
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-97.72%)
Mutual labels:  knowledge-distillation
SemCKD
The official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
Stars: ✭ 42 (-98.41%)
Mutual labels:  knowledge-distillation
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (-89.71%)
Mutual labels:  knowledge-distillation
Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (-22.82%)
Mutual labels:  knowledge-distillation
FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Stars: ✭ 124 (-95.29%)
Mutual labels:  knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-98.97%)
Mutual labels:  knowledge-distillation
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (-75.55%)
Mutual labels:  knowledge-distillation
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-98.29%)
Mutual labels:  knowledge-distillation
WSDM2021 NSM
Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals. WSDM 2021.
Stars: ✭ 84 (-96.81%)
Mutual labels:  teacher-student
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-96.32%)
Mutual labels:  knowledge-distillation
model optimizer
Model optimizer used in Adlik.
Stars: ✭ 22 (-99.16%)
Mutual labels:  knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-98.44%)
Mutual labels:  knowledge-distillation
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-99.2%)
Mutual labels:  knowledge-distillation
Unifiedtransform
School management software.
Stars: ✭ 2,248 (-14.65%)
Mutual labels:  teacher-student
Knowledge distillation via TF2.0
Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF2.0 low-level API.
Stars: ✭ 87 (-96.7%)
Mutual labels:  knowledge-distillation
Efficient-Computing
Efficient-Computing
Stars: ✭ 474 (-82%)
Mutual labels:  knowledge-distillation

Awesome Knowledge Distillation

Papers


Videos


Implementations

MXNet

PyTorch

Lua

Torch

Theano

Lasagne + Theano

TensorFlow

Caffe

Keras
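
All of these implementations build on the core idea from Hinton et al.'s "Distilling the Knowledge in a Neural Network" (2015): train a small student network to match the temperature-softened output distribution of a large teacher, alongside the usual hard-label loss. Below is a minimal sketch of that loss, assuming PyTorch; the function name, hyperparameter defaults, and the toy usage at the end are illustrative, not taken from any repository listed above.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    T (temperature) and alpha (soft/hard mixing weight) are illustrative
    defaults, not values from any listed project.
    """
    # Soft term: KL divergence between temperature-softened distributions,
    # scaled by T^2 so its gradient magnitude matches the hard term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits stand in for real student/teacher forward passes.
student_logits = torch.randn(8, 10, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(8, 10)  # teacher is frozen during distillation
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
loss.backward()
```

Most of the papers catalogued above (attention transfer, activation boundaries, cross-layer calibration, localization distillation, and so on) can be read as replacing or augmenting this soft term with richer notions of what the student should match.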
