
yzd-v / FGD

License: Apache-2.0
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)

Programming Languages

python

Projects that are alternatives of or similar to FGD

BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (-36.29%)
Mutual labels:  knowledge-distillation
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-21.77%)
Mutual labels:  knowledge-distillation
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
Stars: ✭ 49 (-60.48%)
Mutual labels:  knowledge-distillation
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Stars: ✭ 37 (-70.16%)
Mutual labels:  knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-66.94%)
Mutual labels:  knowledge-distillation
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-51.61%)
Mutual labels:  knowledge-distillation
distill-and-select
Authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022]
Stars: ✭ 43 (-65.32%)
Mutual labels:  knowledge-distillation
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-63.71%)
Mutual labels:  knowledge-distillation
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression techniques, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Stars: ✭ 666 (+437.1%)
Mutual labels:  knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-78.23%)
Mutual labels:  knowledge-distillation
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
Stars: ✭ 121 (-2.42%)
Mutual labels:  knowledge-distillation
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (+118.55%)
Mutual labels:  knowledge-distillation
Zero-shot Knowledge Distillation Pytorch
ZSKD with PyTorch
Stars: ✭ 26 (-79.03%)
Mutual labels:  knowledge-distillation
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+301.61%)
Mutual labels:  knowledge-distillation
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-83.06%)
Mutual labels:  knowledge-distillation
Object-Detection-Knowledge-Distillation
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
Stars: ✭ 189 (+52.42%)
Mutual labels:  knowledge-distillation
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (-15.32%)
Mutual labels:  knowledge-distillation
Efficient-Computing
Efficient-Computing
Stars: ✭ 474 (+282.26%)
Mutual labels:  knowledge-distillation
SemCKD
This is the official implementation for the AAAI-2021 paper (Cross-Layer Distillation with Semantic Calibration).
Stars: ✭ 42 (-66.13%)
Mutual labels:  knowledge-distillation
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+419.35%)
Mutual labels:  knowledge-distillation

FGD

CVPR 2022 Paper: Focal and Global Knowledge Distillation for Detectors

Install MMDetection and MS COCO2017

  • Our code is based on MMDetection. Please follow MMDetection's installation instructions and make sure you can run it successfully.
  • This repo uses mmdet==2.11.0 and mmcv-full==1.2.4 (a setup sketch follows this list).
  • If you want to use a higher mmdet version, you may have to change the optimizer in apis/train.py and build_detector in tools/train.py.
  • For mmdet>=2.12.0, if you want to use the inheriting strategy, you have to initialize the student with the teacher's parameters after model.init_weights().
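
A minimal setup sketch matching the pinned versions above, assuming MMDetection is installed from source so its files can be modified in the next step; the mmcv-full wheel must match your local CUDA/PyTorch build, so adjust that install as needed.

# clone MMDetection at the pinned version and install it from source
git clone https://github.com/open-mmlab/mmdetection.git
cd mmdetection
git checkout v2.11.0
# pick the mmcv-full wheel that matches your CUDA/PyTorch build
pip install mmcv-full==1.2.4
pip install -r requirements/build.txt
pip install -v -e .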

Add and Replace the Code

  • Add configs/. from our code to the configs/ directory in MMDetection's code (see the copy sketch after this list).
  • Add mmdet/distillation/. from our code to the mmdet/ directory in MMDetection's code.
  • Replace mmdet/apis/train.py and tools/train.py in MMDetection's code with the mmdet/apis/train.py and tools/train.py from our code.
  • Add pth_transfer.py to MMDetection's code.
  • Unzip the COCO dataset into data/coco/.
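
A hedged sketch of the copy/replace steps above, assuming this repo is checked out at ./FGD and MMDetection at ./mmdetection; adjust the paths to your actual layout.

# copy the distillation configs and modules into the MMDetection tree
cp -r FGD/configs/. mmdetection/configs/
cp -r FGD/mmdet/distillation mmdetection/mmdet/
# replace the training entry points with the FGD versions
cp FGD/mmdet/apis/train.py mmdetection/mmdet/apis/train.py
cp FGD/tools/train.py mmdetection/tools/train.py
# add the checkpoint-transfer script
cp FGD/pth_transfer.py mmdetection/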

Train

#single GPU
python tools/train.py configs/distillers/fgd/fgd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py

#multi GPU
bash tools/dist_train.sh configs/distillers/fgd/fgd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py 8
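
If this repo keeps the standard MMDetection command-line arguments in tools/train.py (an assumption, since the file is replaced), the usual options can be appended; the work directory name below is only a placeholder.

# optional: set an output directory and fix the random seed
python tools/train.py configs/distillers/fgd/fgd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py --work-dir work_dirs/fgd_retina_r50 --seed 0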

Transfer

# Transfer the FGD model into a mmdet model
python pth_transfer.py --fgd_path $fgd_ckpt --output_path $new_mmdet_ckpt

Test

#single GPU
python tools/test.py configs/retinanet/retinanet_r50_fpn_2x_coco.py $new_mmdet_ckpt --eval bbox

#multi GPU
bash tools/dist_test.sh configs/retinanet/retinanet_r50_fpn_2x_coco.py $new_mmdet_ckpt 8 --eval bbox
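
Putting the transfer and test steps together, a hedged end-to-end example; the checkpoint paths are placeholders, and the work_dirs location assumes MMDetection's default output directory.

# placeholder paths; point them at your own distilled checkpoint
fgd_ckpt=work_dirs/fgd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco/latest.pth
new_mmdet_ckpt=work_dirs/retinanet_r50_fgd.pth
python pth_transfer.py --fgd_path $fgd_ckpt --output_path $new_mmdet_ckpt
python tools/test.py configs/retinanet/retinanet_r50_fpn_2x_coco.py $new_mmdet_ckpt --eval bbox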

Results

| Model | Backbone | Baseline (mAP) | +FGD (mAP) | Config | Weight | Code |
|---|---|---|---|---|---|---|
| RetinaNet | ResNet-50 | 37.4 | 40.7 | config | baidu | wsfw |
| RetinaNet | ResNet-101 | 38.9 | 41.7 | config | | |
| Faster RCNN | ResNet-50 | 38.4 | 42.0 | config | baidu | dgpf |
| Faster RCNN | ResNet-101 | 39.8 | 44.1 | config | | |
| RepPoints | ResNet-50 | 38.6 | 42.0 | config | baidu | qx5d |
| RepPoints | ResNet-101 | 40.5 | 43.8 | config | | |
| FCOS | ResNet-50 | 38.5 | 42.7 | config | baidu | sedt |
| MaskRCNN | ResNet-50 | 39.2 | 42.1 | config | baidu | sv8m |
| GFL | ResNet-50 | 40.2 | 43.5 | config | | |

| Model | Backbone | Baseline (Mask mAP) | +FGD (Mask mAP) | Config | Weight | Code |
|---|---|---|---|---|---|---|
| SOLO | ResNet-50 | 33.1 | 36.0 | config | | |
| MaskRCNN | ResNet-50 | 35.4 | 37.8 | config | baidu | sv8m |

| Student | Teacher | Baseline (mAP) | +FGD (mAP) | Config | Weight | Code |
|---|---|---|---|---|---|---|
| YOLOX-m | YOLOX-l | 45.9 | 46.6 | config | baidu | af9g |

For the YOLOX results, please refer to the yolox branch.

Citation

@article{yang2021focal,
  title={Focal and Global Knowledge Distillation for Detectors},
  author={Yang, Zhendong and Li, Zhe and Jiang, Xiaohu and Gong, Yuan and Yuan, Zehuan and Zhao, Danpei and Yuan, Chun},
  journal={arXiv preprint arXiv:2111.11837},
  year={2021}
}

Acknowledgement

Our code is based on the project MMDetection.

Thanks to the works GCNet and mmdetection-distiller.
