ZS-F-VQA: Code and data for the paper "Zero-shot Visual Question Answering using Knowledge Graph" (ISWC 2021)
Stars: ✭ 51 (+96.15%)
Mutual labels: zero-shot
LabelRelaxation-CVPR21: Official PyTorch implementation of "Embedding Transfer with Label Relaxation for Improved Metric Learning" (CVPR 2021)
Stars: ✭ 37 (+42.31%)
Mutual labels: knowledge-distillation
neural-compressor: Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) provides unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+2461.54%)
Mutual labels: knowledge-distillation
ZAQ-code: Zero-shot Adversarial Quantization (ZAQ), CVPR 2021
Stars: ✭ 59 (+126.92%)
Mutual labels: zero-shot
BAKE: Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (+203.85%)
Mutual labels: knowledge-distillation
MLIC-KD-WSD: Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
Stars: ✭ 58 (+123.08%)
Mutual labels: knowledge-distillation
AB distillation: Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (+303.85%)
Mutual labels: knowledge-distillation
awesome-efficient-gnn: Code and resources on scalable and efficient graph neural networks
Stars: ✭ 498 (+1815.38%)
Mutual labels: knowledge-distillation
Ask2Transformers: A framework for textual-entailment-based zero-shot text classification
Stars: ✭ 102 (+292.31%)
Mutual labels: zero-shot
distill-and-select: Authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" (IJCV 2022)
Stars: ✭ 43 (+65.38%)
Mutual labels: knowledge-distillation
score-zeroshot: Semantically consistent regularizer for zero-shot learning
Stars: ✭ 65 (+150%)
Mutual labels: zero-shot
LD: Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (+942.31%)
Mutual labels: knowledge-distillation
head-network-distillation: "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" (IEEE Access) and "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems" (ACM MobiCom HotEdgeVideo 2019)
Stars: ✭ 27 (+3.85%)
Mutual labels: knowledge-distillation
mixed-language-training: Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems (AAAI 2020)
Stars: ✭ 29 (+11.54%)
Mutual labels: zero-shot
kdtf: Knowledge distillation using TensorFlow
Stars: ✭ 139 (+434.62%)
Mutual labels: knowledge-distillation
Distill-BERT-Textgen: Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation"
Stars: ✭ 121 (+365.38%)
Mutual labels: knowledge-distillation
MoTIS: Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (+130.77%)
Mutual labels: knowledge-distillation
MutualGuide: "Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection"
Stars: ✭ 97 (+273.08%)
Mutual labels: knowledge-distillation
SAN: "Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images" (ECCV 2020)
Stars: ✭ 41 (+57.69%)
Mutual labels: knowledge-distillation