
agrija9 / Deep-Unsupervised-Domain-Adaptation

Licence: other
PyTorch implementation of four neural-network-based domain adaptation techniques: Deep CORAL, DDC, CDAN, and CDAN+E, evaluated on the Office-31 benchmark dataset.

Programming Languages

python

Projects that are alternatives of or similar to Deep-Unsupervised-Domain-Adaptation

BIFI
[ICML 2021] Break-It-Fix-It: Unsupervised Learning for Program Repair
Stars: ✭ 74 (+48%)
Mutual labels:  unsupervised-learning, domain-adaptation
Mmt
[ICLR-2020] Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification.
Stars: ✭ 345 (+590%)
Mutual labels:  unsupervised-learning, domain-adaptation
KD3A
Here is the official implementation of the model KD3A in paper "KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation".
Stars: ✭ 63 (+26%)
Mutual labels:  unsupervised-learning, domain-adaptation
Awesome Transfer Learning
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Stars: ✭ 1,349 (+2598%)
Mutual labels:  unsupervised-learning, domain-adaptation
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, tutorials.
Stars: ✭ 8,481 (+16862%)
Mutual labels:  unsupervised-learning, domain-adaptation
TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (-10%)
Mutual labels:  unsupervised-learning, domain-adaptation
Joint-Motion-Estimation-and-Segmentation
[MICCAI'18] Joint Learning of Motion Estimation and Segmentation for Cardiac MR Image Sequences
Stars: ✭ 45 (-10%)
Mutual labels:  unsupervised-learning
temporal-ssl
Video Representation Learning by Recognizing Temporal Transformations. In ECCV, 2020.
Stars: ✭ 46 (-8%)
Mutual labels:  unsupervised-learning
pytorch-revgrad
A minimal pytorch package implementing a gradient reversal layer.
Stars: ✭ 142 (+184%)
Mutual labels:  domain-adaptation
SSTDA
[CVPR 2020] Action Segmentation with Joint Self-Supervised Temporal Domain Adaptation (PyTorch)
Stars: ✭ 150 (+200%)
Mutual labels:  domain-adaptation
hmm market behavior
Unsupervised Learning to Market Behavior Forecasting Example
Stars: ✭ 36 (-28%)
Mutual labels:  unsupervised-learning
deepvis
machine learning algorithms in Swift
Stars: ✭ 54 (+8%)
Mutual labels:  unsupervised-learning
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+332%)
Mutual labels:  domain-adaptation
transfertools
Python toolbox for transfer learning.
Stars: ✭ 22 (-56%)
Mutual labels:  domain-adaptation
Unlearning for MRI harmonisation
Code for implementation of Unlearning Scanner Bias for MRI Harmonisation
Stars: ✭ 22 (-56%)
Mutual labels:  domain-adaptation
CGMM
Official Repository of "Contextual Graph Markov Model" (ICML 2018 - JMLR 2020)
Stars: ✭ 35 (-30%)
Mutual labels:  unsupervised-learning
StyleGAN-nada
stylegan-nada.github.io/
Stars: ✭ 1,018 (+1936%)
Mutual labels:  domain-adaptation
CS-DisMo
[ICCVW 2021] Rethinking Content and Style: Exploring Bias for Unsupervised Disentanglement
Stars: ✭ 20 (-60%)
Mutual labels:  unsupervised-learning
pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem
Stars: ✭ 381 (+662%)
Mutual labels:  domain-adaptation
DAOSL
Implementation of Domain Adaption in One-Shot Learning
Stars: ✭ 14 (-72%)
Mutual labels:  domain-adaptation

Deep-Unsupervised-Domain-Adaptation


PyTorch implementation of four neural-network-based domain adaptation techniques: Deep CORAL, DDC, CDAN, and CDAN+E, evaluated on the Office-31 benchmark dataset.

Paper: Evaluation of Deep Neural Network Domain Adaptation Techniques for Image Recognition

Abstract

It is well established that deep networks are effective at extracting features from a given labeled (source) dataset. However, they do not always generalize well to other (target) datasets, which often have a different underlying distribution. In this report, we evaluate four domain adaptation techniques for image classification tasks: Deep CORAL, Deep Domain Confusion (DDC), Conditional Adversarial Domain Adaptation (CDAN), and CDAN with Entropy Conditioning (CDAN+E). All four are unsupervised techniques: the target dataset carries no labels during the training phase. The experiments are conducted on the Office-31 dataset.
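For intuition, Deep CORAL aligns the second-order statistics (covariance matrices) of source and target feature activations. Below is a minimal PyTorch sketch of the CORAL loss; it is illustrative only and not the exact code in this repository (the function name and the mean-centering details are assumptions):

import torch

def coral_loss(source, target):
    # source, target: (batch, d) feature activations from the shared backbone.
    d = source.size(1)

    def covariance(x):
        # Mean-centered feature covariance matrix, shape (d, d).
        n = x.size(0)
        x = x - x.mean(dim=0, keepdim=True)
        return (x.t() @ x) / (n - 1)

    cs = covariance(source)
    ct = covariance(target)

    # Squared Frobenius norm of the covariance gap, scaled by 4*d^2
    # as in the Deep CORAL paper.
    return ((cs - ct) ** 2).sum() / (4 * d * d)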

Results

Accuracy performance on the Office-31 dataset for the source and target data distributions (with and without transfer losses).

[Loss and accuracy plots for Deep CORAL, DDC, CDAN, and CDAN+E]

Target accuracies for all six domain shifts in the Office-31 dataset (amazon, webcam, dslr):

Method          A → W        A → D        W → A        W → D        D → A        D → W
No Adaptation   43.1 ± 2.5   49.2 ± 3.7   35.6 ± 0.6   94.2 ± 3.1   35.4 ± 0.7   90.9 ± 2.4
DeepCORAL       49.5 ± 2.7   40.0 ± 3.3   38.3 ± 0.4   74.4 ± 4.3   38.5 ± 1.5   89.1 ± 4.4
DDC             41.7 ± 9.1   ---          ---          ---          ---          ---
CDAN            44.9 ± 3.3   49.5 ± 4.6   34.8 ± 2.4   93.3 ± 3.4   32.9 ± 3.4   88.3 ± 3.8
CDAN+E          48.7 ± 7.5   53.7 ± 4.7   35.3 ± 2.7   93.6 ± 3.4   33.9 ± 2.2   87.7 ± 4.0

Training and inference

To train a model on your machine, first download the Office-31 dataset and place it in your data folder.

To train a given method, change into its folder (e.g., DeepCORAL) and run:

cd DeepCORAL/
python main.py --epochs 100 --batch_size_source 128 --batch_size_target 128 --name_source amazon --name_target webcam
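
For orientation, training with adaptation typically minimizes the source classification loss plus a weighted transfer loss, while training without adaptation drops the transfer term. The snippet below is a hedged sketch of that objective; the function name, the lam weight, and the reuse of coral_loss from the sketch above are assumptions, not the repository's exact code:

import torch.nn.functional as F

def total_loss(source_logits, source_labels, source_feats, target_feats, lam=1.0):
    # Supervised cross-entropy on the labeled source batch.
    cls = F.cross_entropy(source_logits, source_labels)
    # Unsupervised transfer loss on feature activations; with
    # --adapt_domain disabled, lam would effectively be 0.
    transfer = coral_loss(source_feats, target_feats)
    return cls + lam * transfer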

Loss and accuracy plots

Once the model is trained, you can generate plots like the ones shown above by running:

cd DeepCORAL/
python plot_loss_acc.py --source amazon --target webcam --no_epochs 10
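
For reference, a plotting script of this kind might look like the sketch below; the log file name and the dictionary keys ("loss", "accuracy") are assumptions about how the training history is saved with pickle, not the repository's actual format:

import pickle
import matplotlib.pyplot as plt

# Load a pickled training history (hypothetical file name and keys).
with open("training_history.pkl", "rb") as f:
    history = pickle.load(f)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(history["loss"])
ax1.set(xlabel="epoch", ylabel="loss", title="Training loss")
ax2.plot(history["accuracy"])
ax2.set(xlabel="epoch", ylabel="accuracy", title="Target accuracy")
fig.tight_layout()
plt.show()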

The following arguments can be provided (see the argparse sketch after this list):

  • --epochs number of training epochs
  • --batch_size_source batch size of source data
  • --batch_size_target batch size of target data
  • --name_source name of source dataset
  • --name_target name of target dataset
  • --num_classes number of classes in the dataset
  • --load_model flag to load a pretrained model (AlexNet by default)
  • --adapt_domain boolean flag to train with or without the method's transfer loss
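
A minimal argparse sketch of how these flags could be wired up is shown below; the default values and the boolean handling are assumptions, not necessarily the repository's choices:

import argparse

# Hypothetical parser mirroring the flags listed above.
parser = argparse.ArgumentParser(description="Train a domain adaptation method on Office-31")
parser.add_argument("--epochs", type=int, default=100, help="number of training epochs")
parser.add_argument("--batch_size_source", type=int, default=128, help="batch size of source data")
parser.add_argument("--batch_size_target", type=int, default=128, help="batch size of target data")
parser.add_argument("--name_source", type=str, default="amazon", help="name of source dataset")
parser.add_argument("--name_target", type=str, default="webcam", help="name of target dataset")
parser.add_argument("--num_classes", type=int, default=31, help="number of classes in the dataset")
parser.add_argument("--load_model", action="store_true", help="load a pretrained model (AlexNet by default)")
parser.add_argument("--adapt_domain", action="store_true", help="train with the method's transfer loss")
args = parser.parse_args()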

Requirements

  • tqdm
  • PyTorch
  • matplotlib
  • numpy
  • pickle (Python standard library)
  • scikit-image
  • torchvision
