
Yochengliu / MLIC-KD-WSD

Licence: other
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)

Programming Languages

C++
36643 projects - #6 most used programming language
Python
139335 projects - #7 most used programming language
Cuda
1817 projects
Jupyter Notebook
11667 projects
CMake
9771 projects
Matlab
3953 projects

Projects that are alternatives of or similar to MLIC-KD-WSD

multi-label-classification
machine-learning tensorflow multi-label-classification
Stars: ✭ 27 (-53.45%)
Mutual labels:  multi-label-classification
Object-Detection-Knowledge-Distillation
An Object Detection Knowledge Distillation framework powered by pytorch, now having SSD and yolov5.
Stars: ✭ 189 (+225.86%)
Mutual labels:  knowledge-distillation
Generative MLZSL
[TPAMI Under Submission] Generative Multi-Label Zero-Shot Learning
Stars: ✭ 37 (-36.21%)
Mutual labels:  multi-label-classification
ZS SDL
Official Pytorch Implementation of: "Semantic Diversity Learning for Zero-Shot Multi-label Classification" (ICCV, 2021) paper
Stars: ✭ 22 (-62.07%)
Mutual labels:  multi-label-classification
distill-and-select
Authors' official PyTorch implementation of the "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022]
Stars: ✭ 43 (-25.86%)
Mutual labels:  knowledge-distillation
GalaXC
GalaXC: Graph Neural Networks with Labelwise Attention for Extreme Classification
Stars: ✭ 28 (-51.72%)
Mutual labels:  multi-label-classification
extremeText
Library for fast text representation and extreme classification.
Stars: ✭ 141 (+143.1%)
Mutual labels:  multi-label-classification
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
Stars: ✭ 121 (+108.62%)
Mutual labels:  knowledge-distillation
mybabe
MyBB CAPTCHA Solver using Convolutional Neural Network in Keras
Stars: ✭ 18 (-68.97%)
Mutual labels:  multi-label-classification
multi-label-text-classification
Multi-label text classification using ConvNet and graph embedding (Tensorflow implementation)
Stars: ✭ 44 (-24.14%)
Mutual labels:  multi-label-classification
Awesome Project Ideas
Curated list of Machine Learning, NLP, Vision, Recommender Systems Project Ideas
Stars: ✭ 6,114 (+10441.38%)
Mutual labels:  multi-label-classification
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (-53.45%)
Mutual labels:  knowledge-distillation
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+758.62%)
Mutual labels:  knowledge-distillation
C-Tran
General Multi-label Image Classification with Transformers
Stars: ✭ 106 (+82.76%)
Mutual labels:  multi-label-classification
DECAF
DECAF: Deep Extreme Classification with Label Features
Stars: ✭ 46 (-20.69%)
Mutual labels:  multi-label-classification
Caver
Caver: a toolkit for multilabel text classification.
Stars: ✭ 38 (-34.48%)
Mutual labels:  multi-label-classification
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (+36.21%)
Mutual labels:  knowledge-distillation
ImageNet21K
Official Pytorch Implementation of: "ImageNet-21K Pretraining for the Masses" (NeurIPS, 2021) paper
Stars: ✭ 565 (+874.14%)
Mutual labels:  multi-label-classification
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Stars: ✭ 37 (-36.21%)
Mutual labels:  knowledge-distillation
MCAR
Learning to Discover Multi-Class Attentional Regions for Multi-Label Image Recognition
Stars: ✭ 32 (-44.83%)
Mutual labels:  multi-label-classification

Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection

This repository contains the code (in Caffe) for the paper:

Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection [ACM DL] [arXiv]
Yongcheng Liu, Lu Sheng, Jing Shao*, Junjie Yan, Shiming Xiang and Chunhong Pan
ACM Multimedia 2018

Project Page: https://yochengliu.github.io/MLIC-KD-WSD/

Weakly Supervised Detection (WSD)

  • We use WSDDN [1] as the detection model, i.e., the teacher model.

  • Because the released code of WSDDN is implemented in Matlab (based on MatConvNet), we first reproduced WSDDN in Caffe.

[1] Hakan Bilen and Andrea Vedaldi, "Weakly Supervised Deep Detection Networks". In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016.

Reproduction results

detection

[result table: wsddn_det]

  • Paper: training with 5 scales + mirror; testing with fusion of 5 scales + mirror

  • Ours: training with 5 scales + mirror; testing with a single forward pass

classification

[result table: wsddn_cls]

Datalist Preparation

    image_path one_hot_label_vector(e.g., 0 1 1 ...) proposal_info(e.g., x_min y_min x_max y_max score x_min y_min x_max y_max score ...)
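
For concreteness, a minimal Python helper that writes one line in this format might look as follows; the helper name, file names, and example values are hypothetical and not part of the repository.

    # Hypothetical helper: one datalist line = image path, multi-hot label
    # vector, then (x_min, y_min, x_max, y_max, score) for each proposal.
    def write_datalist_line(f, image_path, labels, proposals):
        fields = [image_path] + [str(v) for v in labels]
        for (x_min, y_min, x_max, y_max, score) in proposals:
            fields += [str(x_min), str(y_min), str(x_max), str(y_max), str(score)]
        f.write(' '.join(fields) + '\n')

    # Example: a 20-class multi-hot label vector and two scored proposals.
    with open('train_datalist.txt', 'w') as f:
        write_datalist_line(f, 'VOC2007/JPEGImages/000005.jpg',
                            [0, 1, 1] + [0] * 17,
                            [(48, 240, 195, 371, 0.9), (8, 12, 352, 498, 0.6)])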

Training & Test

    ./wsddn/wsddn_train(deploy).prototxt
  • VGG16 is used as the backbone model.

  • For training, we did not use the spatial regularizer. More details can be found in the paper.

  • For testing, you can use Pycaffe or Matcaffe; a minimal Pycaffe sketch is given below.
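
A single-forward test with Pycaffe could look roughly like the sketch below. The weights file name and the blob names ('data', 'rois', 'cls_prob') are assumptions; check the deploy prototxt under ./wsddn/ for the actual interface.

    import numpy as np
    import caffe

    caffe.set_mode_gpu()
    # Hypothetical file names for the deploy definition and trained weights.
    net = caffe.Net('./wsddn/wsddn_deploy.prototxt', 'wsddn.caffemodel', caffe.TEST)

    # Preprocess one image: HWC/RGB/[0, 1] -> CHW/BGR/[0, 255] with mean subtraction.
    image = caffe.io.load_image('example.jpg')
    transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
    transformer.set_transpose('data', (2, 0, 1))
    transformer.set_channel_swap('data', (2, 1, 0))
    transformer.set_raw_scale('data', 255)
    transformer.set_mean('data', np.array([104.0, 117.0, 123.0]))
    net.blobs['data'].data[...] = transformer.preprocess('data', image)

    # Region proposals for this image as rows of (batch_index, x_min, y_min, x_max, y_max).
    rois = np.array([[0, 48, 240, 195, 371],
                     [0,  8,  12, 352, 498]], dtype=np.float32)
    net.blobs['rois'].reshape(*rois.shape)
    net.blobs['rois'].data[...] = rois

    net.forward()
    print(net.blobs['cls_prob'].data)  # image-level class scores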

Multi-Label Image Classification (MLIC)

  • The MLIC model in our framework, i.e., the student model, is very compact for efficiency.

  • It consists of a popular CNN model (VGG16, as the backbone) followed by a fully connected layer (as the classifier); see the sketch after this list.

  • The backbone model of the student could be different from the teacher's.
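
As an illustration only (the actual networks are defined by the prototxt files in this repository), the student's overall structure can be sketched with pycaffe's NetSpec as below; the dummy Input layers and the sigmoid cross-entropy loss are assumptions, not the repository's definitions.

    import caffe
    from caffe import layers as L

    def student_sketch(num_classes=20):
        n = caffe.NetSpec()
        # Dummy inputs standing in for the image batch and the multi-hot label vector.
        n.data = L.Input(shape=dict(dim=[1, 3, 224, 224]))
        n.label = L.Input(shape=dict(dim=[1, num_classes]))
        # The VGG16 backbone would be inserted here; this single InnerProduct
        # layer is only a stand-in for its final feature output.
        n.feat = L.InnerProduct(n.data, num_output=4096)
        # One fully connected layer on top of the backbone acts as the classifier.
        n.score = L.InnerProduct(n.feat, num_output=num_classes)
        # Multi-label training typically uses a per-class sigmoid cross-entropy loss.
        n.loss = L.SigmoidCrossEntropyLoss(n.score, n.label)
        return n.to_proto()

    print(student_sketch())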

Cross-Task Knowledge Distillation

Stage 1: Feature-Level Knowledge Transfer

    ./kd/train_stage1.prototxt

Stage 2: Prediction-Level Knowledge Transfer

    ./kd/train_stage2.prototxt

Datalist preparation is the same as described in the WSD section above. More details can be found in our paper. A schematic sketch of the two transfer stages is given below.
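
As a rough, framework-agnostic illustration (the actual losses are defined in the prototxt files above and in the paper), the two stages can be viewed as feature mimicry followed by soft-target matching; the numpy sketch below assumes a standard L2 loss for Stage 1 and binary cross-entropy terms for Stage 2.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def stage1_feature_loss(student_feat, teacher_feat):
        # Stage 1: the student mimics the teacher's feature representation (L2 loss).
        return 0.5 * np.mean((student_feat - teacher_feat) ** 2)

    def stage2_prediction_loss(student_logits, teacher_probs, labels, alpha=0.5):
        # Stage 2: the student matches the teacher's class-level predictions
        # (soft targets) while still fitting the ground-truth multi-hot labels.
        p, eps = sigmoid(student_logits), 1e-8
        hard = -np.mean(labels * np.log(p + eps) + (1 - labels) * np.log(1 - p + eps))
        soft = -np.mean(teacher_probs * np.log(p + eps) + (1 - teacher_probs) * np.log(1 - p + eps))
        return alpha * hard + (1 - alpha) * soft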

Implementation

Please refer to caffe-MLIC for details.

Citation

If our paper is helpful for your research, please consider citing:

    @inproceedings{liu2018mlickdwsd,   
      author = {Yongcheng Liu and    
                Lu Sheng and    
                Jing Shao and   
                Junjie Yan and   
                Shiming Xiang and   
                Chunhong Pan},   
      title = {Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection},   
      booktitle = {ACM International Conference on Multimedia},    
      pages = {700--708},  
      year = {2018}   
    }   

Contact

If you have ideas or questions about our research that you would like to share with us, please contact [email protected]
