
jalola / Improved Wgan Pytorch

License: MIT
Improved WGAN in Pytorch

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Improved Wgan Pytorch

Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+3233.33%)
Mutual labels:  wgan-gp
WGAN GP
Keras model and tensorflow optimization of 'improved Training of Wasserstein GANs'
Stars: ✭ 16 (-95.2%)
Mutual labels:  wgan-gp
tfworldhackathon
GitHub repo for my Tensorflow World hackathon submission
Stars: ✭ 17 (-94.89%)
Mutual labels:  wgan-gp
Dcgan Lsgan Wgan Gp Dragan Pytorch
DCGAN LSGAN WGAN-GP DRAGAN PyTorch
Stars: ✭ 134 (-59.76%)
Mutual labels:  wgan-gp
Fun-with-MNIST
Playing with MNIST. Machine Learning. Generative Models.
Stars: ✭ 23 (-93.09%)
Mutual labels:  wgan-gp
progressive growing of GANs
Pure tensorflow implementation of progressive growing of GANs
Stars: ✭ 31 (-90.69%)
Mutual labels:  wgan-gp
Unified Gan Tensorflow
A Tensorflow implementation of GAN, WGAN and WGAN with gradient penalty.
Stars: ✭ 93 (-72.07%)
Mutual labels:  wgan-gp
wgan-gp
Pytorch implementation of Wasserstein GANs with Gradient Penalty
Stars: ✭ 161 (-51.65%)
Mutual labels:  wgan-gp
Pytorch-Basic-GANs
Simple Pytorch implementations of most used Generative Adversarial Network (GAN) varieties.
Stars: ✭ 101 (-69.67%)
Mutual labels:  wgan-gp
WGAN-GP-TensorFlow
TensorFlow implementations of Wasserstein GAN with Gradient Penalty (WGAN-GP), Least Squares GAN (LSGAN), GANs with the hinge loss.
Stars: ✭ 42 (-87.39%)
Mutual labels:  wgan-gp
Dcgan wgan wgan Gp lsgan sngan rsgan began acgan pggan tensorflow
Implementation of some different variants of GANs by tensorflow, Train the GAN in Google Cloud Colab, DCGAN, WGAN, WGAN-GP, LSGAN, SNGAN, RSGAN, RaSGAN, BEGAN, ACGAN, PGGAN, pix2pix, BigGAN
Stars: ✭ 166 (-50.15%)
Mutual labels:  wgan-gp
Gan Tutorial
Simple Implementation of many GAN models with PyTorch.
Stars: ✭ 227 (-31.83%)
Mutual labels:  wgan-gp
Generative-Model
Repository for implementation of generative models with Tensorflow 1.x
Stars: ✭ 66 (-80.18%)
Mutual labels:  wgan-gp
Pytorch Gan Collections
PyTorch implementation of DCGAN, WGAN-GP and SNGAN.
Stars: ✭ 128 (-61.56%)
Mutual labels:  wgan-gp
Improved-Wasserstein-GAN-application-on-MRI-images
Improved Wasserstein GAN (WGAN-GP) application on medical (MRI) images
Stars: ✭ 23 (-93.09%)
Mutual labels:  wgan-gp
Ganotebooks
wgan, wgan2(improved, gp), infogan, and dcgan implementation in lasagne, keras, pytorch
Stars: ✭ 1,446 (+334.23%)
Mutual labels:  wgan-gp
speech-enhancement-WGAN
speech enhancement GAN on waveform/log-power-spectrum data using Improved WGAN
Stars: ✭ 35 (-89.49%)
Mutual labels:  wgan-gp
SRGAN-PyTorch
A PyTorch implementation of SRGAN specific for Anime Super Resolution based on "Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network". And another PyTorch WGAN-gp implementation of SRGAN referring to "Improved Training of Wasserstein GANs".
Stars: ✭ 65 (-80.48%)
Mutual labels:  wgan-gp
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-92.79%)
Mutual labels:  wgan-gp
WGAN-GP-tensorflow
Tensorflow Implementation of Paper "Improved Training of Wasserstein GANs"
Stars: ✭ 23 (-93.09%)
Mutual labels:  wgan-gp

Improved Training of Wasserstein GANs in PyTorch

This is a PyTorch implementation of gan_64x64.py from Improved Training of Wasserstein GANs.
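
For context, the key contribution of the paper is the gradient penalty added to the critic loss. Below is a minimal sketch of how that penalty is typically computed in PyTorch; the function name, tensor shapes, and lambda value are illustrative and are not taken from this repository's code.

import torch
from torch import autograd

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # WGAN-GP term: lambda * (||grad_{x_hat} D(x_hat)||_2 - 1)^2 on random interpolates
    batch_size = real.size(0)
    alpha = torch.rand(batch_size, 1, 1, 1, device=real.device)   # per-sample mixing weight
    x_hat = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    d_hat = critic(x_hat)
    grads = autograd.grad(outputs=d_hat, inputs=x_hat,
                          grad_outputs=torch.ones_like(d_hat),
                          create_graph=True, retain_graph=True)[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()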

To do:

  • [x] Support parameters in CLI *
  • [x] Add requirements.txt *
  • [ ] Add Dockerfile if possible
  • [x] Multiple GPUs *
  • [x] Clean up code, remove unused code *

* not yet ready for the conditional GAN

Run

  • Example:

Fresh training:

CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py --train_dir /path/to/train --validation_dir /path/to/validation/ --output_path /path/to/output/ --dim 64 --saving_step 300 --num_workers 8

Continued training:

CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py --train_dir /path/to/train --validation_dir /path/to/validation/ --output_path /path/to/output/ --dim 64 --saving_step 300 --num_workers 8 --restore_mode --start_iter 5000
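
The flags used above suggest a command-line interface along the following lines. This is only an illustrative argparse sketch; the actual argument names, defaults, and any additional options live in train.py.

import argparse

parser = argparse.ArgumentParser(description='WGAN-GP training (illustrative sketch)')
parser.add_argument('--train_dir', type=str, required=True)       # folder of training images
parser.add_argument('--validation_dir', type=str, required=True)  # folder of validation images
parser.add_argument('--output_path', type=str, required=True)     # where checkpoints/samples go
parser.add_argument('--dim', type=int, default=64)                # model width / image size
parser.add_argument('--saving_step', type=int, default=300)       # how often to save
parser.add_argument('--num_workers', type=int, default=8)         # DataLoader workers
parser.add_argument('--restore_mode', action='store_true')        # resume from a checkpoint
parser.add_argument('--start_iter', type=int, default=0)          # iteration to resume from
args = parser.parse_args()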

Model

  • train.py: This model is mainly based on the GoodGenerator and GoodDiscriminator from the gan_64x64.py model in Improved Training of Wasserstein GANs. It has been trained on the LSUN dataset for around 100k iterations (a schematic training loop is sketched after this list).
  • congan_train.py: ACGAN implementation, trained on 4 classes of the LSUN dataset
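
For orientation, here is a schematic of how a WGAN-GP generator/critic pair is usually alternated (5 critic updates per generator update, as in the paper). The tiny nn.Sequential networks below are placeholders standing in for GoodGenerator and GoodDiscriminator, and the batch source is synthetic; this is a sketch, not the repository's actual training loop.

import torch
from torch import nn, optim

# Placeholder networks; the real 64x64 ResNet architectures are defined in the repo.
G = nn.Sequential(nn.Linear(128, 64 * 64 * 3), nn.Tanh())
D = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64 * 3, 1))
opt_g = optim.Adam(G.parameters(), lr=1e-4, betas=(0.0, 0.9))
opt_d = optim.Adam(D.parameters(), lr=1e-4, betas=(0.0, 0.9))

def sample_real(batch_size=16):
    # Stand-in for a real LSUN batch coming from the DataLoader
    return torch.rand(batch_size, 3, 64, 64) * 2 - 1

for it in range(2):                                   # a couple of iterations, just to run end to end
    for _ in range(5):                                # 5 critic steps per generator step
        real = sample_real()
        fake = G(torch.randn(real.size(0), 128)).view_as(real).detach()
        d_loss = D(fake).mean() - D(real).mean()      # Wasserstein critic loss
        # d_loss = d_loss + gradient_penalty(D, real, fake)   # penalty term from the sketch above
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    fake = G(torch.randn(16, 128)).view(16, 3, 64, 64)
    g_loss = -D(fake).mean()                          # generator tries to maximize the critic score
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()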

Result

1. WGAN: trained on the bedroom dataset (100k iterations)

Sample 1 Sample 2

2. ACGAN: trained on 4 classes (100k iterations)

  • dining_room: 1
  • bridge: 2
  • restaurant: 3
  • tower: 4
Sample 1 Sample 2

Testing

During the implementation of this model, we built a test module to compare the results of the original model (TensorFlow) and our model (PyTorch) for every layer we implemented. It is available at compare-tensorflow-pytorch.
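
The idea of the per-layer check can be illustrated as follows: run the same input through the PyTorch layer and through a reference computation (standing in for the TensorFlow output), then compare numerically. This is only a toy example; the real harness lives in the compare-tensorflow-pytorch repository.

import numpy as np
import torch
from torch import nn

x = np.random.randn(4, 16).astype(np.float32)           # shared test input

layer = nn.Linear(16, 8)                                 # PyTorch layer under test
w = layer.weight.detach().numpy()                        # (8, 16)
b = layer.bias.detach().numpy()                          # (8,)

torch_out = layer(torch.from_numpy(x)).detach().numpy()
reference_out = x @ w.T + b                              # what the original implementation should give

assert np.allclose(torch_out, reference_out, atol=1e-5), 'layer outputs diverge'
print('layer outputs match within tolerance')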

TensorboardX

Results such as costs and generated images (written every 200 iterations) are logged for TensorBoard in the ./runs folder.

To display the results in TensorBoard, run: tensorboard --logdir runs
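
For reference, logging with tensorboardX typically looks like the sketch below; the tags, logging intervals, and placeholder values here are illustrative and may not match exactly what train.py writes.

import torch
from tensorboardX import SummaryWriter   # pip install tensorboardX

writer = SummaryWriter('./runs')          # same folder that tensorboard --logdir runs reads

for it in range(600):
    d_cost = torch.rand(1).item()                       # placeholder critic cost
    writer.add_scalar('d_cost', d_cost, it)
    if it % 200 == 0:                                   # images every 200 iterations
        fake_image = torch.rand(3, 64, 64)              # placeholder generated image (C, H, W)
        writer.add_image('generated', fake_image, it)

writer.close()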

Acknowledgements

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].