tim-learn / SHOT-plus

License: MIT
Code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer"

Projects that are alternatives of or similar to SHOT-plus

Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc.: papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+18336.96%)
Mutual labels:  transfer-learning, domain-adaptation, self-supervised-learning
BA3US
code for our ECCV 2020 paper "A Balanced and Uncertainty-aware Approach for Partial Domain Adaptation"
Stars: ✭ 31 (-32.61%)
Mutual labels:  transfer-learning, domain-adaptation
Meta-SelfLearning
Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark
Stars: ✭ 157 (+241.3%)
Mutual labels:  domain-adaptation, multi-source-domain-adaptation
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (-43.48%)
Mutual labels:  transfer-learning, domain-adaptation
SimPLE
Code for the paper: "SimPLE: Similar Pseudo Label Exploitation for Semi-Supervised Classification"
Stars: ✭ 50 (+8.7%)
Mutual labels:  semi-supervised-learning, transfer-learning
exponential-moving-average-normalization
PyTorch implementation of EMAN for self-supervised and semi-supervised learning: https://arxiv.org/abs/2101.08482
Stars: ✭ 76 (+65.22%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+1650%)
Mutual labels:  transfer-learning, self-supervised-learning
temporal-ssl
Video Representation Learning by Recognizing Temporal Transformations. In ECCV, 2020.
Stars: ✭ 46 (+0%)
Mutual labels:  transfer-learning, self-supervised-learning
transfer-learning-algorithms
Implementation of many transfer learning algorithms in Python with Jupyter notebooks
Stars: ✭ 42 (-8.7%)
Mutual labels:  transfer-learning, domain-adaptation
cmd
Central Moment Discrepancy for Domain-Invariant Representation Learning (ICLR 2017, keras)
Stars: ✭ 53 (+15.22%)
Mutual labels:  transfer-learning, domain-adaptation
improving segmentation with selfsupervised depth
[CVPR21] Implementation of our work "Three Ways to Improve Semantic Segmentation with Self-Supervised Depth Estimation"
Stars: ✭ 189 (+310.87%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
Transfer-learning-materials
resource collection for transfer learning!
Stars: ✭ 213 (+363.04%)
Mutual labels:  transfer-learning, domain-adaptation
visda2019-multisource
Source code of our submission (Rank 1) for Multi-Source Domain Adaptation task in VisDA-2019
Stars: ✭ 49 (+6.52%)
Mutual labels:  domain-adaptation, multi-source-domain-adaptation
sesemi
supervised and semi-supervised image classification with self-supervision (Keras)
Stars: ✭ 43 (-6.52%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+76.09%)
Mutual labels:  transfer-learning, self-supervised-learning
Transformers-Domain-Adaptation
Adapt Transformer-based language models to new text domains
Stars: ✭ 67 (+45.65%)
Mutual labels:  transfer-learning, domain-adaptation
SSL CR Histo
Official code for "Self-Supervised driven Consistency Training for Annotation Efficient Histopathology Image Analysis" Published in Medical Image Analysis (MedIA) Journal, Oct, 2021.
Stars: ✭ 32 (-30.43%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
transfertools
Python toolbox for transfer learning.
Stars: ✭ 22 (-52.17%)
Mutual labels:  transfer-learning, domain-adaptation
pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem
Stars: ✭ 381 (+728.26%)
Mutual labels:  transfer-learning, domain-adaptation
DualStudent
Code for Paper ''Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning'' [ICCV 2019]
Stars: ✭ 106 (+130.43%)
Mutual labels:  semi-supervised-learning, domain-adaptation

Official implementation for SHOT++

[TPAMI-2021] Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer

Framework:

  1. train on the source domain; (Section 3.1)
  2. hypothesis transfer with information maximization and self-supervised learning; (Section 3.2 & Section 3.3) (note that SHOT here denotes the results after Step 2, which adds a rotation-driven self-supervised objective on top of the original SHOT from ICML 2020; both objectives are sketched after this list)
  3. labeling transfer with semi-supervised learning. (Section 3.4) (note that SHOT++ additionally runs this semi-supervised learning step via MixMatch)
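
For intuition, below is a minimal PyTorch sketch of the two target-side objectives in Step 2: the information-maximization (IM) loss of Section 3.2 and the rotation-prediction self-supervised loss of Section 3.3. The function names, the auxiliary rotation head, and the epsilon constant are illustrative assumptions, not code from this repository.

    import torch
    import torch.nn.functional as F

    def information_maximization_loss(logits, eps=1e-5):
        # Sketch of the IM objective (Section 3.2): predictions should be
        # individually confident yet globally diverse on the target data.
        probs = F.softmax(logits, dim=1)
        # Entropy term: make each individual prediction confident.
        ent = -(probs * torch.log(probs + eps)).sum(dim=1).mean()
        # Diversity term: keep the batch-averaged prediction close to uniform
        # (this term is the negative entropy of the mean prediction).
        mean_probs = probs.mean(dim=0)
        div = (mean_probs * torch.log(mean_probs + eps)).sum()
        return ent + div

    def rotation_ssl_loss(rot_head, feat_orig, feat_rot, rot_labels):
        # Sketch of the rotation task (Section 3.3): predict the relative
        # rotation (0/90/180/270 degrees, encoded as classes 0-3) from the
        # concatenated features of an image and its rotated copy.
        # `rot_head` is an assumed 4-way auxiliary classifier.
        logits_rot = rot_head(torch.cat([feat_orig, feat_rot], dim=1))
        return F.cross_entropy(logits_rot, rot_labels)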

Prerequisites:

  • python == 3.6.8
  • pytorch == 1.1.0
  • torchvision == 0.3.0
  • numpy, scipy, sklearn, PIL, argparse, tqdm

Dataset:

  • Please manually download the datasets Office, Office-Home, VisDA-C, and Office-Caltech from the official websites, and modify the image paths in each '.txt' file under the folder './object/data/'. [Instructions for generating such '.txt' files can be found at https://github.com/tim-learn/Generate_list; the expected line format is sketched after this list.]

  • For the Digits datasets, the code automatically downloads the three digit datasets (i.e., MNIST, USPS, and SVHN) into './digit/data/'.
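
As a point of reference, each line of these list files pairs an image path with an integer class index. The helper below is a hypothetical generator assuming a '<domain_root>/<class_name>/<image>' folder layout; the format is inferred from the Generate_list repository linked above, not copied from it.

    import os

    def write_image_list(domain_root, txt_path):
        # Write one "<image_path> <class_index>" pair per line, with
        # class indices assigned by sorted class-folder name.
        classes = sorted(os.listdir(domain_root))
        with open(txt_path, 'w') as f:
            for idx, cls in enumerate(classes):
                cls_dir = os.path.join(domain_root, cls)
                for name in sorted(os.listdir(cls_dir)):
                    f.write('{} {}\n'.format(os.path.join(cls_dir, name), idx))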

Training:

  1. Unsupervised Closed-set Domain Adaptation (UDA) on the Digits dataset
    • MNIST -> USPS (m2u)
     cd digit/
     python uda_digit.py --gpu_id 0 --seed 2021 --dset m2u --output ckps_digits --cls_par 0.1 --ssl 0.2 
     python digit_mixmatch.py --gpu_id 0 --seed 2021 --dset m2u --output ckps_mm --output_tar ckps_digits --cls_par 0.1 --ssl 0.2 --alpha 0.1
  2. Unsupervised Closed-set Domain Adaptation (UDA) on the Office/Office-Home datasets
    • Train model on the source domain A (s = 0) [--max_epoch 50 for Office-Home]
    cd uda/
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da uda --output ckps/source/ --dset office --max_epoch 100 --s 0
    • Adaptation to other target domains D and W (hypothesis transfer)
    python image_target.py --gpu_id 0 --seed 2021 --da uda --output ckps/target/ --dset office --s 0 --cls_par 0.3 --ssl 0.6
    • Adaptation to other target domains D and W (following labeling transfer) [--max_epoch 50 for Office-Home]
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da uda --dset office --max_epoch 100 --s 0 --output_tar ckps/target/ --output ckps/mixmatch/ --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0
  3. Unsupervised Closed-set Domain Adaptation (UDA) on the VISDA-C dataset
    • Train model on the Synthetic domain [--max_epoch 10 --lr 1e-3]
    cd uda/
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da uda --output ckps/source/ --dset VISDA-C --net resnet101 --lr 1e-3 --max_epoch 10 --s 0
    • Adaptation to the real domain (hypothesis transfer)
    python image_target.py --gpu_id 0 --seed 2021 --da uda --output ckps/target/ --dset VISDA-C --s 0 --net resnet101 --cls_par 0.3 --ssl 0.6
    • Adaptation to the real domain (following labeling transfer)
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da uda --dset VISDA-C --max_epoch 10 --s 0 --output_tar ckps/target/ --output ckps/mixmatch/ --net resnet101 --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0
  4. Unsupervised Partial-set Domain Adaptation (PDA) on the Office-Home dataset
    • Train model on the source domain A (s = 0)
    cd pda/
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da pda --output ckps/source/ --dset office-home --max_epoch 50 --s 0
    • Adaptation to other target domains (hypothesis transfer)
    python image_target.py --gpu_id 0 --seed 2021 --da pda --dset office-home --s 0 --output_src ckps/source/ --output ckps/target/ --cls_par 0.3 --ssl 0.6
    • Adaptation to other target domains (following labeling transfer)
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da pda --dset office-home --max_epoch 50 --s 0 --output_tar ckps/target/ --output ckps/mixmatch/ --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0
  5. Unsupervised Multi-source Domain Adaptation (MSDA) on the Office-Home dataset
    • Train model on the source domains Ar (s = 0), Cl (s = 1), Pr (s = 2), respectively
    cd msda/
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da uda --dset office-home --output ckps/source/ --net resnet50 --max_epoch 50 --s 0
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da uda --dset office-home --output ckps/source/ --net resnet50 --max_epoch 50 --s 1
    python image_source.py --gpu_id 0 --seed 2021 --trte val --da uda --dset office-home --output ckps/source/ --net resnet50 --max_epoch 50 --s 2
    • Adaptation to the target domain (hypothesis transfer)
    python image_target.py --gpu_id 0 --seed 2021 --cls_par 0.3 --ssl 0.6 --da uda --dset office-home --output_src ckps/source/ --output ckps/target/ --net resnet50 --s 0
    python image_target.py --gpu_id 0 --seed 2021 --cls_par 0.3 --ssl 0.6 --da uda --dset office-home --output_src ckps/source/ --output ckps/target/ --net resnet50 --s 1
    python image_target.py --gpu_id 0 --seed 2021 --cls_par 0.3 --ssl 0.6 --da uda --dset office-home --output_src ckps/source/ --output ckps/target/ --net resnet50 --s 2
    • Adaptation to the target domain (labeling transfer)
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da uda --dset office-home --max_epoch 50 --output_tar ckps/target/ --output ckps/mixmatch/ --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0 --net resnet50 --s 0
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da uda --dset office-home --max_epoch 50 --output_tar ckps/target/ --output ckps/mixmatch/ --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0 --net resnet50 --s 1
    python image_mixmatch.py --gpu_id 0 --seed 2021 --da uda --dset office-home --max_epoch 50 --output_tar ckps/target/ --output ckps/mixmatch/ --cls_par 0.3 --ssl 0.6 --choice ent --ps 0.0 --net resnet50 --s 2 
    • Combine the domain-specific scores (a plausible fusion rule is sketched below)
    python image_ms.py --gpu_id 0 --seed 2021 --cls_par 0.3 --ssl 0.6 --da uda --dset office-home --output_src ckps/source/ --output ckps/target/ --output_mm ckps/mixmatch/ --net resnet50 --t 3
  6. Semi-supervised Domain Adaptation (SSDA) on the Office-Home dataset
    • Train model on the source domain Ar (s = 0)
    cd ssda/
    python image_source.py --gpu_id 0 --seed 2021 --output ckps/source/ --dset office-home --max_epoch 50 --s 0
    • Adaptation to the target domain Cl (t = 1) [hypothesis transfer]
    python image_target.py --gpu_id 0 --seed 2021 --cls_par 0.1 --ssl 0.2 --output_src ckps/source --output ckps/target --dset office-home --s 0 --t 1 
    • Adaptation to the target domain Cl (t = 1) [labeling transfer]
    python image_mixmatch.py --gpu_id 0 --seed 2021 --ps 0.0 --cls_par 0.1 --ssl 0.2 --output_tar ckps/target --output ckps/mixmatch --dset office-home --max_epoch 50 --s 0 --t 1
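
The '--choice ent' and '--ps' flags in the image_mixmatch.py commands above control how target samples are divided before MixMatch. A hedged sketch of an entropy-based split (the function name and exact ratio handling are my assumptions, not taken from this repo): confident predictions (low entropy) form the pseudo-labeled set, and the rest stay unlabeled.

    import torch
    import torch.nn.functional as F

    def split_by_entropy(logits, ratio=0.5):
        # Rank target samples by prediction entropy; the most confident
        # `ratio` fraction is treated as labeled (with its pseudo-labels)
        # for MixMatch, and the remainder as unlabeled.
        probs = F.softmax(logits, dim=1)
        ent = -(probs * torch.log(probs + 1e-5)).sum(dim=1)
        order = torch.argsort(ent)              # low entropy first
        n_labeled = int(ratio * len(order))
        labeled_idx, unlabeled_idx = order[:n_labeled], order[n_labeled:]
        pseudo_labels = probs.argmax(dim=1)[labeled_idx]
        return labeled_idx, pseudo_labels, unlabeled_idx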

Please refer to './xxda/run_xxda.sh' (where 'xxda' is one of uda, pda, msda, ssda) for all the settings for different methods and scenarios.
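
In the MSDA setting (step 5 above), image_ms.py combines the per-source predictions into a single decision. A minimal sketch of one plausible fusion rule, uniform averaging of softmax scores (an illustrative assumption; see image_ms.py for the actual rule):

    import torch
    import torch.nn.functional as F

    def combine_source_scores(per_source_logits):
        # Average the softmax scores of the source-specific models and
        # take the class with the highest fused score.
        scores = torch.stack([F.softmax(l, dim=1) for l in per_source_logits])
        return scores.mean(dim=0).argmax(dim=1)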

Citation

If you find this code useful for your research, please cite our papers:

@article{liang2021source,  
 title={Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer}, 
 author={Liang, Jian and Hu, Dapeng and Wang, Yunbo and He, Ran and Feng, Jiashi},   
 journal={IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
 year={2021}, 
 note={In Press}  
}

@inproceedings{liang2020we, 
 title={Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation}, 
 author={Liang, Jian and Hu, Dapeng and Feng, Jiashi}, 
 booktitle={International Conference on Machine Learning (ICML)},  
 pages={6028--6039},
 year={2020}
}
