yogeshbalaji / Generate_to_adapt
Implementation of "Generate To Adapt: Aligning Domains using Generative Adversarial Networks"
Generate_To_Adapt
Implementation of "Generate To Adapt: Aligning Domains using Generative Adversarial Networks" in PyTorch
Datasets:
Please download the dataset from http://www.cs.umd.edu/~yogesh/datasets/digits.zip and extract it. The extracted folder contains the dataset in the format needed by our code.
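The download-and-extract step above can be scripted. A minimal helper sketch (the URL comes from this README; the function name and directory layout are my own, not part of the repo):

```python
import io
import urllib.request
import zipfile
from pathlib import Path

DIGITS_URL = "http://www.cs.umd.edu/~yogesh/datasets/digits.zip"

def download_and_extract(url: str, dest: str) -> Path:
    """Download a zip archive and extract it under dest; return the dest path."""
    dest_path = Path(dest)
    dest_path.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    archive.extractall(dest_path)
    return dest_path

# download_and_extract(DIGITS_URL, "./digits")
# then pass ./digits as the --dataroot argument below
```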
Training:
Let us train the LeNet model for SVHN->MNIST domain adaptation. Obtain the baseline numbers by running
python main.py --dataroot [path to the dataset] --method sourceonly
To train our method (GTA), run
python main.py --dataroot [path to the dataset] --method GTA
This trains the models and stores them in the result folder. Both the current checkpoint and the model that performs best on the validation set are saved.
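For orientation, the GTA approach (per the paper) trains an embedding network F whose features, concatenated with a one-hot class code and noise, drive a generator G that synthesizes source-like images; a discriminator D with an auxiliary classifier head provides the adversarial signal that aligns the two domains. A shape-level PyTorch sketch of that architecture (all module definitions and sizes here are illustrative stand-ins, not the repo's actual code):

```python
import torch
import torch.nn as nn

NUM_CLASSES, EMB_DIM, NOISE_DIM = 10, 128, 64

class F(nn.Module):
    """Embedding network (LeNet-style stand-in) mapping 32x32 images to features."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 8 * 8, EMB_DIM))
    def forward(self, x):
        return self.net(x)

class G(nn.Module):
    """Generator: embedding + one-hot label + noise -> source-like image."""
    def __init__(self):
        super().__init__()
        in_dim = EMB_DIM + NUM_CLASSES + NOISE_DIM
        self.net = nn.Sequential(
            nn.Linear(in_dim, 32 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Tanh())
    def forward(self, emb, onehot, z):
        return self.net(torch.cat([emb, onehot, z], dim=1))

class D(nn.Module):
    """Discriminator with a real/fake head and an auxiliary classifier head."""
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.Flatten())
        self.adv = nn.Linear(32 * 8 * 8, 1)            # real vs. generated
        self.cls = nn.Linear(32 * 8 * 8, NUM_CLASSES)  # auxiliary class logits
    def forward(self, x):
        h = self.trunk(x)
        return self.adv(h), self.cls(h)

# Shape check on a dummy batch of four 32x32 RGB images.
x = torch.randn(4, 3, 32, 32)
emb = F()(x)
onehot = torch.eye(NUM_CLASSES)[torch.randint(0, NUM_CLASSES, (4,))]
fake = G()(emb, onehot, torch.randn(4, NOISE_DIM))
adv_out, cls_out = D()(fake)
```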
Evaluation:
To evaluate the trained models on the target domain (MNIST), run
python eval.py --dataroot [path to the dataset] --method GTA --model_best False
Citation:
If you use this code for your research, please cite
@article{Gen2Adapt,
author = {Swami Sankaranarayanan and
Yogesh Balaji and
Carlos D. Castillo and
Rama Chellappa},
title = {Generate To Adapt: Aligning Domains using Generative Adversarial Networks},
journal = {CoRR},
volume = {abs/1704.01705},
year = {2017},
url = {http://arxiv.org/abs/1704.01705},
}