License: MIT
[CVPR2021] Domain Consensus Clustering for Universal Domain Adaptation



[Paper]

Prerequisites

  • Python 3.6
  • PyTorch 1.4.0
  • GPU memory: 10 GB

To install the remaining requirements:

pip install -r requirements.txt

Getting Started

Download the datasets: Office-31, OfficeHome, VisDA, and DomainNet.

Data folder structure:

Your dataset DIR:
|-Office/domain_adaptation_images
| |-amazon
| |-webcam
| |-dslr
|-OfficeHome
| |-Art
| |-Product
| |-...
|-VisDA
| |-train
| |-validation
|-DomainNet
| |-clipart
| |-painting
| |-...
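
A quick sanity check of this layout can be sketched as follows (DATA_ROOT and the listed sub-folders are illustrative, taken from the tree above; adjust them to your own dataset root):

```shell
# Check that the expected dataset sub-directories exist under a given root.
# DATA_ROOT is a placeholder path; the sub-folders mirror the tree above.
DATA_ROOT=${DATA_ROOT:-/path/to/datasets}
for d in Office/domain_adaptation_images/amazon OfficeHome/Art VisDA/train DomainNet/clipart; do
  if [ -d "$DATA_ROOT/$d" ]; then
    echo "found: $d"
  else
    echo "missing: $d"
  fi
done
```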

Then modify the data path in the config files, i.e., config.root.

Training

Train on a single Office transfer (choose one of uda, osda, or pda for setting):

CUDA_VISIBLE_DEVICES=0 python office_run.py note=EXP_NAME setting=uda/osda/pda source=amazon target=dslr

Train on all six Office transfers:

CUDA_VISIBLE_DEVICES=0 python office_run.py note=EXP_NAME setting=uda/osda/pda transfer_all=1

Train on OfficeHome:

CUDA_VISIBLE_DEVICES=0 python officehome_run.py note=EXP_NAME setting=uda/osda/pda source=Art target=Product

or

CUDA_VISIBLE_DEVICES=0 python officehome_run.py note=EXP_NAME setting=uda/osda/pda transfer_all=1 

The final results (both the best and the last) are saved to ./snapshot/EXP_NAME/result.txt.

Note that transfer_all consumes more shared memory.
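
If shared memory is tight, the six Office transfers can instead be enumerated and run one at a time. A sketch that simply prints the six commands (flags as shown above; EXP_NAME is a placeholder to substitute):

```shell
# Enumerate the six Office transfers (source != target) and print the
# command each one would run; pipe to `sh` to actually execute them.
for src in amazon webcam dslr; do
  for tgt in amazon webcam dslr; do
    [ "$src" = "$tgt" ] && continue
    echo "CUDA_VISIBLE_DEVICES=0 python office_run.py note=EXP_NAME setting=uda source=$src target=$tgt"
  done
done
```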

Citation

If you find it helpful, please consider citing:

@inproceedings{li2021DCC,
  title={Domain Consensus Clustering for Universal Domain Adaptation},
  author={Li, Guangrui and Kang, Guoliang and Zhu, Yi and Wei, Yunchao and Yang, Yi},
  booktitle={IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2021}
}
