
36 open-source projects that are alternatives to or similar to Zero-shot_Knowledge_Distillation_Pytorch
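
Nearly every project below shares the knowledge-distillation theme of the base repository. For orientation only, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) in PyTorch; the temperature T, mixing weight alpha, and the function name are illustrative assumptions, not code taken from any project listed here.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft-target term: KL divergence between temperature-scaled
        # teacher and student distributions, rescaled by T^2 so gradient
        # magnitudes stay comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-label term: ordinary cross-entropy on the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

Zero-shot variants (as in the base repository) differ mainly in where the inputs come from, synthesizing data from the teacher instead of using the original training set, but the student objective typically remains a loss of this form.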

MoTIS
Mobile (iOS) Text-to-Image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (+130.77%)
Mutual labels:  knowledge-distillation
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (+303.85%)
Mutual labels:  knowledge-distillation
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (+273.08%)
Mutual labels:  knowledge-distillation
mixed-language-training
Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems (AAAI-2020)
Stars: ✭ 29 (+11.54%)
Mutual labels:  zero-shot
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) provides unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, with the goal of optimal inference performance.
Stars: ✭ 666 (+2461.54%)
Mutual labels:  knowledge-distillation
Ask2Transformers
A framework for zero-shot text classification based on textual entailment
Stars: ✭ 102 (+292.31%)
Mutual labels:  zero-shot
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (+57.69%)
Mutual labels:  knowledge-distillation
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (+942.31%)
Mutual labels:  knowledge-distillation
MLIC-KD-WSD
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
Stars: ✭ 58 (+123.08%)
Mutual labels:  knowledge-distillation
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
Stars: ✭ 121 (+365.38%)
Mutual labels:  knowledge-distillation
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Stars: ✭ 37 (+42.31%)
Mutual labels:  knowledge-distillation
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+1815.38%)
Mutual labels:  knowledge-distillation
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (+203.85%)
Mutual labels:  knowledge-distillation
score-zeroshot
Semantically consistent regularizer for zero-shot learning
Stars: ✭ 65 (+150%)
Mutual labels:  zero-shot
Object-Detection-Knowledge-Distillation
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
Stars: ✭ 189 (+626.92%)
Mutual labels:  knowledge-distillation
distill-and-select
The authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" (IJCV 2022)
Stars: ✭ 43 (+65.38%)
Mutual labels:  knowledge-distillation
ZAQ-code
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (+126.92%)
Mutual labels:  zero-shot
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (+3.85%)
Mutual labels:  knowledge-distillation
ZS-F-VQA
Code and data for the paper "Zero-shot Visual Question Answering using Knowledge Graph" (ISWC 2021)
Stars: ✭ 51 (+96.15%)
Mutual labels:  zero-shot
kdtf
Knowledge distillation using TensorFlow
Stars: ✭ 139 (+434.62%)
Mutual labels:  knowledge-distillation
Awesome Knowledge Distillation
Awesome Knowledge Distillation
Stars: ✭ 2,634 (+10030.77%)
Mutual labels:  knowledge-distillation
Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+7719.23%)
Mutual labels:  knowledge-distillation
Knowledge distillation via TF2.0
Code for recent knowledge distillation algorithms and benchmark results using the TF 2.0 low-level API
Stars: ✭ 87 (+234.62%)
Mutual labels:  knowledge-distillation
model optimizer
Model optimizer used in Adlik.
Stars: ✭ 22 (-15.38%)
Mutual labels:  knowledge-distillation
ACCV TinyGAN
BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression
Stars: ✭ 62 (+138.46%)
Mutual labels:  knowledge-distillation
FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Stars: ✭ 124 (+376.92%)
Mutual labels:  knowledge-distillation
Efficient-Computing
Efficient-Computing
Stars: ✭ 474 (+1723.08%)
Mutual labels:  knowledge-distillation
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (+73.08%)
Mutual labels:  knowledge-distillation
SemCKD
The official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
Stars: ✭ 42 (+61.54%)
Mutual labels:  knowledge-distillation
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-19.23%)
Mutual labels:  knowledge-distillation
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
Stars: ✭ 49 (+88.46%)
Mutual labels:  knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (+3.85%)
Mutual labels:  knowledge-distillation
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+2376.92%)
Mutual labels:  knowledge-distillation
Modality-Transferable-MER
Modality-Transferable-MER: a multimodal emotion recognition model with zero-shot and few-shot abilities.
Stars: ✭ 36 (+38.46%)
Mutual labels:  zero-shot
NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction"
Stars: ✭ 166 (+538.46%)
Mutual labels:  zero-shot
gpt-j-api
API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Stars: ✭ 248 (+853.85%)
Mutual labels:  zero-shot