Pretrained Language Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+328.9%)
Mutual labels: knowledge-distillation, model-compression
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-95.57%)
Mutual labels: knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-91.35%)
Mutual labels: knowledge-distillation
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-91.35%)
Mutual labels: model-compression
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+35.86%)
Mutual labels: knowledge-distillation
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (-42.83%)
Mutual labels: knowledge-distillation
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-90.72%)
Mutual labels: model-compression
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-87.34%)
Mutual labels: knowledge-distillation
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-79.54%)
Mutual labels: knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-94.3%)
Mutual labels: knowledge-distillation
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) provides unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, aiming for optimal inference performance.
Stars: ✭ 666 (+40.51%)
Mutual labels: knowledge-distillation
SemCKD
This is the official implementation for the AAAI-2021 paper (Cross-Layer Distillation with Semantic Calibration).
Stars: ✭ 42 (-91.14%)
Mutual labels: knowledge-distillation
torch-model-compression
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a model compression algorithm library with automatic model structure analysis.
Stars: ✭ 126 (-73.42%)
Mutual labels: model-compression
FastPose
PyTorch real-time multi-person keypoint estimation
Stars: ✭ 36 (-92.41%)
Mutual labels: model-compression
ESNAC
Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-94.3%)
Mutual labels: model-compression
ProSelfLC-2021
Noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-90.51%)
Mutual labels: knowledge-distillation
allie
🤖 A machine learning framework for audio, text, image, video, or .CSV files (50+ featurizers and 15+ model trainers).
Stars: ✭ 93 (-80.38%)
Mutual labels: model-compression