
FengHZ / KD3A

License: MIT

Programming Languages

Python

Projects that are alternatives of or similar to KD3A

TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (-28.57%)
Mutual labels:  transfer-learning, unsupervised-learning, domain-adaptation
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, tutorials.
Stars: ✭ 8,481 (+13361.9%)
Mutual labels:  transfer-learning, unsupervised-learning, domain-adaptation
Awesome Transfer Learning
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Stars: ✭ 1,349 (+2041.27%)
Mutual labels:  transfer-learning, unsupervised-learning, domain-adaptation
temporal-ssl
Video Representation Learning by Recognizing Temporal Transformations. In ECCV, 2020.
Stars: ✭ 46 (-26.98%)
Mutual labels:  transfer-learning, unsupervised-learning
Clan
( CVPR2019 Oral ) Taking A Closer Look at Domain Shift: Category-level Adversaries for Semantics Consistent Domain Adaptation
Stars: ✭ 248 (+293.65%)
Mutual labels:  transfer-learning, domain-adaptation
transfertools
Python toolbox for transfer learning.
Stars: ✭ 22 (-65.08%)
Mutual labels:  transfer-learning, domain-adaptation
Transferlearning Tutorial
LaTeX source of the handbook 《迁移学习简明手册》 (A Concise Handbook of Transfer Learning)
Stars: ✭ 2,122 (+3268.25%)
Mutual labels:  transfer-learning, domain-adaptation
cmd
Central Moment Discrepancy for Domain-Invariant Representation Learning (ICLR 2017, keras)
Stars: ✭ 53 (-15.87%)
Mutual labels:  transfer-learning, domain-adaptation
Deep-Unsupervised-Domain-Adaptation
Pytorch implementation of four neural network based domain adaptation techniques: DeepCORAL, DDC, CDAN and CDAN+E. Evaluated on benchmark dataset Office31.
Stars: ✭ 50 (-20.63%)
Mutual labels:  unsupervised-learning, domain-adaptation
Transfer-learning-materials
resource collection for transfer learning!
Stars: ✭ 213 (+238.1%)
Mutual labels:  transfer-learning, domain-adaptation
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+1177.78%)
Mutual labels:  transfer-learning, unsupervised-learning
Awesome Domain Adaptation
A collection of AWESOME things about domain adaptation
Stars: ✭ 3,357 (+5228.57%)
Mutual labels:  transfer-learning, domain-adaptation
Transformers-Domain-Adaptation
Adapt Transformer-based language models to new text domains
Stars: ✭ 67 (+6.35%)
Mutual labels:  transfer-learning, domain-adaptation
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (-58.73%)
Mutual labels:  transfer-learning, domain-adaptation
pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem
Stars: ✭ 381 (+504.76%)
Mutual labels:  transfer-learning, domain-adaptation
Seg Uncertainty
IJCAI2020 & IJCV 2020 🌇 Unsupervised Scene Adaptation with Memory Regularization in vivo
Stars: ✭ 202 (+220.63%)
Mutual labels:  transfer-learning, domain-adaptation
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+28.57%)
Mutual labels:  transfer-learning, unsupervised-learning
transfer-learning-algorithms
Implementation of many transfer learning algorithms in Python with Jupyter notebooks
Stars: ✭ 42 (-33.33%)
Mutual labels:  transfer-learning, domain-adaptation
Shot
code released for our ICML 2020 paper "Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation"
Stars: ✭ 134 (+112.7%)
Mutual labels:  transfer-learning, domain-adaptation
Complete Life Cycle Of A Data Science Project
Complete-Life-Cycle-of-a-Data-Science-Project
Stars: ✭ 140 (+122.22%)
Mutual labels:  transfer-learning, unsupervised-learning

KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation (Accepted at ICML 2021)

Here is the official implementation of the KD3A model from the paper "KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation".

Model Review

  • Knowledge Distillation

    (figure: overview of the knowledge distillation step)

  • Knowledge Vote

    (figure: overview of the knowledge vote step)
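
To make the Knowledge Vote idea concrete, here is a minimal conceptual sketch: each source model predicts on the same target batch, and a pseudo-label is kept only when enough source models agree on it with high confidence. The tensor shapes, the 0.9 confidence gate, and the two-vote minimum are illustrative assumptions, not the repository's exact implementation.

import torch
import torch.nn.functional as F

def knowledge_vote(source_probs, conf_threshold=0.9, min_votes=2):
    # source_probs: (num_sources, batch, num_classes) softmax outputs of
    # the source models on the same target batch (illustrative sketch).
    conf, pred = source_probs.max(dim=-1)       # per-source confidence / label
    valid = (conf >= conf_threshold).float()    # count confident votes only

    num_sources, batch, num_classes = source_probs.shape
    votes = torch.zeros(batch, num_classes)
    for s in range(num_sources):
        votes += F.one_hot(pred[s], num_classes).float() * valid[s].unsqueeze(-1)

    # A sample earns a pseudo-label when enough confident sources agree.
    top_votes, pseudo_labels = votes.max(dim=-1)
    mask = top_votes >= min_votes               # samples that won the vote
    return pseudo_labels, mask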

Setup

Install Package Dependencies

Python Environment: >= 3.6
torch >= 1.2.0
torchvision >= 0.4.0
tensorboard >= 2.0.0
numpy
yaml
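
A one-line install for these dependencies might look as follows; note that the yaml requirement corresponds to the pyyaml package on PyPI (an assumption about the intended package):

pip install "torch>=1.2.0" "torchvision>=0.4.0" "tensorboard>=2.0.0" numpy pyyaml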

Install Datasets

We require users to declare a base path that stores the datasets as well as the training logs. The directory structure should be

base_path
│
└───dataset
│   └───DigitFive
│   │       mnist_data.mat
│   │       mnistm_with_label.mat
│   │       svhn_train_32x32.mat
│   │       ...
│   └───DomainNet
│   │       ...
│   └───OfficeCaltech10
│   │       ...
│   └───Office31
│   │       ...
│   └───AmazonReview
│           ...
└───trained_model_1
│       parmater
│       runs
└───trained_model_2
│       parmater
│       runs
...
└───trained_model_n
        parmater
        runs
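
The dataset folders in this layout can be scaffolded in one shot; this is a convenience sketch, with the assumption that the trained_model_* directories are produced by training runs rather than created by hand:

mkdir -p base_path/dataset/{DigitFive,DomainNet,OfficeCaltech10,Office31,AmazonReview}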

Our framework currently supports five multi-source domain adaptation datasets: DigitFive, DomainNet, AmazonReview, OfficeCaltech10, and Office31.

  • DigitFive

    The DigitFive dataset can be accessed on Google Drive.

  • DomainNet

    VisDA2019 provides the DomainNet dataset.

  • AmazonReview

    The AmazonReview dataset can be accessed on Google Drive.

Unsupervised Multi-source Domain Adaptation

The configuration files can be found under the folder ./config, and we provide four config files in the .yaml format. To perform unsupervised multi-source decentralized domain adaptation on a specific dataset (e.g., DomainNet), use the following command:

python main.py --config DomainNet.yaml --target-domain clipart -bp base_path
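
The other datasets follow the same pattern. For example, assuming DigitFive.yaml is one of the provided config files and mnist is a valid target-domain name (both assumptions for illustration), adaptation to MNIST would be:

python main.py --config DigitFive.yaml --target-domain mnist -bp base_path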

The training process for DomainNet is shown below.

(figures: Top-1 and Top-5 accuracy curves during training)

Negative Transfer

During training, our model records the domain weights as well as the accuracy on the target domain:

Source Domains  :['infograph', 'painting', 'quickdraw', 'real', 'sketch']

Domain Weight : [0.1044, 0.3263, 0.0068, 0.2531, 0.2832]

Target Domain clipart Accuracy Top1 : 0.726 Top5: 0.902
  • Irrelevant Domains

    We view quickdraw as an irrelevant domain, and KD3A assigns it a low weight during training (0.0068 for quickdraw in the example above).

  • Malicious Domains

    We use a poisoning attack with level $m\%$ to create malicious domains. The related settings in the configuration file are as follows:

    UMDAConfig:
        malicious:
          attack_domain: "real"
          attack_level: 0.3
    

    With this setting, we perform a poisoning attack on the source domain real with $30\%$ mislabeled samples, as sketched below.
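
Conceptually, a poisoning attack at level $m\%$ re-labels a random $m\%$ of the attacked domain's training samples with incorrect classes. A minimal sketch of that corruption step (illustrative, not the repository's implementation):

import numpy as np

def poison_labels(labels, attack_level=0.3, num_classes=345, seed=0):
    # Randomly mislabel a fraction `attack_level` of the samples.
    # num_classes defaults to 345 (DomainNet); adjust per dataset.
    rng = np.random.default_rng(seed)
    labels = labels.copy()
    n_poison = int(attack_level * len(labels))
    idx = rng.choice(len(labels), size=n_poison, replace=False)
    # Shift by a random non-zero offset so the new label is always wrong.
    offsets = rng.integers(1, num_classes, size=n_poison)
    labels[idx] = (labels[idx] + offsets) % num_classes
    return labels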

Communication Rounds

We also provide settings in the .yaml config files to perform model aggregation with communication rounds $r$:

UMDAConfig:
    communication_rounds: 1

The communication rounds can be set to values in $\{0.2, 0.5, 1, \ldots, N\}$.
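
One plausible reading of fractional values (an interpretation on our part, not taken from the repository): $r \geq 1$ aggregates the models every epoch or more often, while $r < 1$ aggregates only once every $1/r$ epochs, trading adaptation accuracy for lower communication cost. A sketch of that schedule:

def aggregation_epochs(num_epochs, communication_rounds):
    # Interpretation, not the repo's code: r < 1 aggregates every 1/r
    # epochs; r >= 1 is collapsed to once per epoch for simplicity.
    # e.g., aggregation_epochs(10, 0.5) -> [2, 4, 6, 8, 10]
    r = communication_rounds
    interval = max(1, round(1 / r)) if r < 1 else 1
    return [e for e in range(1, num_epochs + 1) if e % interval == 0]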

Reference

If you find this useful in your work, please consider citing:

@InProceedings{pmlr-v139-feng21f,
  title     = {KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation},
  author    = {Feng, Haozhe and You, Zhaoyang and Chen, Minghao and Zhang, Tianye and Zhu, Minfeng and Wu, Fei and Wu, Chao and Chen, Wei},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {3274--3283},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR}
}