Albert0147 / G-SFDA

License: MIT
Code for our ICCV 2021 paper 'Generalized Source-free Domain Adaptation'

Programming Languages

python
Jupyter Notebook
shell

Projects that are alternatives of or similar to G-SFDA

Meta-SelfLearning
Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark
Stars: ✭ 157 (+78.41%)
Mutual labels:  domain-adaptation, iccv2021
TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (-48.86%)
Mutual labels:  domain-adaptation
Dta.pytorch
Official implementation of Drop to Adapt: Learning Discriminative Features for Unsupervised Domain Adaptation, to be presented at ICCV 2019.
Stars: ✭ 144 (+63.64%)
Mutual labels:  domain-adaptation
Learning Via Translation
Image-Image Domain Adaptation with Preserved Self-Similarity and Domain-Dissimilarity for Person Re-identification (https://arxiv.org/pdf/1711.07027.pdf). CVPR2018
Stars: ✭ 202 (+129.55%)
Mutual labels:  domain-adaptation
Squeezesegv2
Implementation of SqueezeSegV2, Improved Model Structure and Unsupervised Domain Adaptation for Road-Object Segmentation from a LiDAR Point Cloud
Stars: ✭ 154 (+75%)
Mutual labels:  domain-adaptation
Intrada
Unsupervised Intra-domain Adaptation for Semantic Segmentation through Self-Supervision (CVPR 2020 Oral)
Stars: ✭ 211 (+139.77%)
Mutual labels:  domain-adaptation
Domain Adaptive Faster Rcnn Pytorch
Domain Adaptive Faster R-CNN in PyTorch
Stars: ✭ 135 (+53.41%)
Mutual labels:  domain-adaptation
pytorch-revgrad
A minimal pytorch package implementing a gradient reversal layer.
Stars: ✭ 142 (+61.36%)
Mutual labels:  domain-adaptation
Clan
( CVPR2019 Oral ) Taking A Closer Look at Domain Shift: Category-level Adversaries for Semantics Consistent Domain Adaptation
Stars: ✭ 248 (+181.82%)
Mutual labels:  domain-adaptation
Bnm
code of Towards Discriminability and Diversity: Batch Nuclear-norm Maximization under Label Insufficient Situations (CVPR2020 oral)
Stars: ✭ 192 (+118.18%)
Mutual labels:  domain-adaptation
Crst
Code for <Confidence Regularized Self-Training> in ICCV19 (Oral)
Stars: ✭ 177 (+101.14%)
Mutual labels:  domain-adaptation
Transferlearning Tutorial
LaTeX source of the 'Concise Handbook of Transfer Learning' (《迁移学习简明手册》)
Stars: ✭ 2,122 (+2311.36%)
Mutual labels:  domain-adaptation
Ta3n
[ICCV 2019 (Oral)] Temporal Attentive Alignment for Large-Scale Video Domain Adaptation (PyTorch)
Stars: ✭ 217 (+146.59%)
Mutual labels:  domain-adaptation
Cbst
Code for <Domain Adaptation for Semantic Segmentation via Class-Balanced Self-Training> in ECCV18
Stars: ✭ 146 (+65.91%)
Mutual labels:  domain-adaptation
MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-50%)
Mutual labels:  domain-adaptation
Cdcl Human Part Segmentation
Repository for Paper: Cross-Domain Complementary Learning Using Pose for Multi-Person Part Segmentation (TCSVT20)
Stars: ✭ 143 (+62.5%)
Mutual labels:  domain-adaptation
Self Similarity Grouping
Self-similarity Grouping: A Simple Unsupervised Cross Domain Adaptation Approach for Person Re-identification (ICCV 2019, Oral)
Stars: ✭ 171 (+94.32%)
Mutual labels:  domain-adaptation
Seg Uncertainty
IJCAI2020 & IJCV 2020 🌇 Unsupervised Scene Adaptation with Memory Regularization in vivo
Stars: ✭ 202 (+129.55%)
Mutual labels:  domain-adaptation
transfertools
Python toolbox for transfer learning.
Stars: ✭ 22 (-75%)
Mutual labels:  domain-adaptation
SSTDA
[CVPR 2020] Action Segmentation with Joint Self-Supervised Temporal Domain Adaptation (PyTorch)
Stars: ✭ 150 (+70.45%)
Mutual labels:  domain-adaptation

Generalized Source-free Domain Adaptation (ICCV 2021)

Code for our ICCV 2021 paper 'Generalized Source-free Domain Adaptation' (based on PyTorch 1.3 and CUDA 10.0; see 'requirements.txt' to reproduce the results). [project] [paper]

(Please also check our NeurIPS 2021 paper 'Exploiting the Intrinsic Neighborhood Structure for Source-free Domain Adaptation'. [project] [paper] [code] It goes deeper into neighborhood clustering for SFDA by simply introducing reciprocity.)

Dataset preparation

Download the VisDA and Office-Home datasets (use our provided image list files), then set the path to the data lists in the code.
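A minimal sketch of reading such an image list, assuming each line holds a relative image path followed by an integer class label (check the provided list files for the exact layout; the `root` path here is a placeholder):

```python
def parse_image_list(lines, root="/data/visda"):
    """Return (absolute_path, label) pairs from image-list lines of the
    assumed form 'relative/path/img.jpg label_id'."""
    samples = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        path, label = line.rsplit(" ", 1)  # label is the last field
        samples.append((f"{root}/{path}", int(label)))
    return samples
```

The resulting pairs can then be fed to a standard dataset class that loads each image by path.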

Training

First train the model on the source data with both source and target attention, then adapt the model to the target domain without access to the source data. An embedding layer automatically produces the domain attention.

sh visda.sh (for VisDA)
sh office-home.sh (for Office-Home)
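The domain-attention idea above can be sketched as follows: an embedding layer maps a domain ID to a gating vector that masks the feature channels. This is a hypothetical illustration, not the paper's code; the backbone, layer sizes, and sigmoid gating are assumptions:

```python
import torch
import torch.nn as nn

class DomainAttentionNet(nn.Module):
    """Classifier whose features are gated by a per-domain attention vector
    produced by an embedding layer (source domain = 0, target domain = 1)."""
    def __init__(self, in_dim=512, feat_dim=256, num_classes=12, num_domains=2):
        super().__init__()
        self.backbone = nn.Linear(in_dim, feat_dim)  # stand-in for a real backbone
        self.domain_embed = nn.Embedding(num_domains, feat_dim)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x, domain_id):
        feat = self.backbone(x)
        # Embedding lookup -> sigmoid gate: each domain activates its own
        # subset of feature channels.
        attn = torch.sigmoid(self.domain_embed(domain_id))
        return self.classifier(feat * attn)
```

Because the gate depends only on the domain ID, the same network can serve both domains by switching the embedding index at inference time.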

Checkpoints

We provide the training log files, the source model, and the target model on VisDA in this link. You can directly start the source-free adaptation from our source model to reproduce the results.

Domain Classifier

The notebook 'domain_classifier.ipynb' contains the code for training the domain classifier and for evaluating the model with the estimated domain ID (on VisDA).
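A minimal sketch of such a domain classifier, assuming it is a linear layer trained on extracted features to predict the domain ID (the feature dimension and training setup are assumptions, not the notebook's exact code):

```python
import torch
import torch.nn as nn

def train_domain_classifier(src_feats, tgt_feats, epochs=100, lr=0.1):
    """Train a linear classifier that predicts 0 for source features
    and 1 for target features."""
    feats = torch.cat([src_feats, tgt_feats])
    labels = torch.cat([torch.zeros(len(src_feats), dtype=torch.long),
                        torch.ones(len(tgt_feats), dtype=torch.long)])
    clf = nn.Linear(feats.size(1), 2)
    opt = torch.optim.SGD(clf.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(clf(feats), labels).backward()
        opt.step()
    return clf
```

At test time, the classifier's argmax gives the estimated domain ID used to select the domain attention.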

Acknowledgement

The code is based on SHOT (ICML 2020, also source-free).
