
lilianweng / Unified Gan Tensorflow

A Tensorflow implementation of GAN, WGAN and WGAN with gradient penalty.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Unified Gan Tensorflow

Gan Tutorial
Simple Implementation of many GAN models with PyTorch.
Stars: ✭ 227 (+144.09%)
Mutual labels:  gan, wgan, wgan-gp
Pytorch Generative Model Collections
Collection of generative models in Pytorch version.
Stars: ✭ 2,296 (+2368.82%)
Mutual labels:  gan, wgan, wgan-gp
Awesome Gans
Awesome Generative Adversarial Networks with tensorflow
Stars: ✭ 585 (+529.03%)
Mutual labels:  gan, wgan, wgan-gp
Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+3969.89%)
Mutual labels:  gan, wgan, wgan-gp
Tf.gans Comparison
Implementations of (theoretical) generative adversarial networks and comparison without cherry-picking
Stars: ✭ 477 (+412.9%)
Mutual labels:  gan, wgan, wgan-gp
progressive growing of GANs
Pure tensorflow implementation of progressive growing of GANs
Stars: ✭ 31 (-66.67%)
Mutual labels:  wgan, wgan-gp
Generative-Model
Repository for implementation of generative models with Tensorflow 1.x
Stars: ✭ 66 (-29.03%)
Mutual labels:  wgan, wgan-gp
Improved-Wasserstein-GAN-application-on-MRI-images
Improved Wasserstein GAN (WGAN-GP) application on medical (MRI) images
Stars: ✭ 23 (-75.27%)
Mutual labels:  wgan, wgan-gp
GANs-Keras
GANs Implementations in Keras
Stars: ✭ 24 (-74.19%)
Mutual labels:  gan, wgan
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-74.19%)
Mutual labels:  wgan, wgan-gp
chainer-wasserstein-gan
Chainer implementation of the Wasserstein GAN
Stars: ✭ 20 (-78.49%)
Mutual labels:  gan, wgan
Pytorch-Basic-GANs
Simple Pytorch implementations of most used Generative Adversarial Network (GAN) varieties.
Stars: ✭ 101 (+8.6%)
Mutual labels:  wgan, wgan-gp
Fun-with-MNIST
Playing with MNIST. Machine Learning. Generative Models.
Stars: ✭ 23 (-75.27%)
Mutual labels:  wgan, wgan-gp
WGAN-GP-tensorflow
Tensorflow Implementation of Paper "Improved Training of Wasserstein GANs"
Stars: ✭ 23 (-75.27%)
Mutual labels:  wgan, wgan-gp
Rnn.wgan
Code for training and evaluation of the model from "Language Generation with Recurrent Generative Adversarial Networks without Pre-training"
Stars: ✭ 252 (+170.97%)
Mutual labels:  gan, wgan
wgan-gp
Pytorch implementation of Wasserstein GANs with Gradient Penalty
Stars: ✭ 161 (+73.12%)
Mutual labels:  gan, wgan-gp
Pytorch Gan
A minimal implementation (less than 150 lines of code with visualization) of DCGAN/WGAN in PyTorch with jupyter notebooks
Stars: ✭ 150 (+61.29%)
Mutual labels:  gan, wgan
Generative Models
Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN
Stars: ✭ 438 (+370.97%)
Mutual labels:  gan, wgan
Mimicry
[CVPR 2020 Workshop] A PyTorch GAN library that reproduces research results for popular GANs.
Stars: ✭ 458 (+392.47%)
Mutual labels:  gan, wgan-gp
Dcgan Lsgan Wgan Gp Dragan Tensorflow 2
DCGAN LSGAN WGAN-GP DRAGAN Tensorflow 2
Stars: ✭ 373 (+301.08%)
Mutual labels:  wgan, wgan-gp

Original, Wasserstein, and Wasserstein-Gradient-Penalty DCGAN

(*) This repo is a modification of carpedm20/DCGAN-tensorflow.

(*) Full credit for the model structure design goes to carpedm20/DCGAN-tensorflow.

I started with carpedm20/DCGAN-tensorflow because its DCGAN implementation is not tied to a single dataset, which is uncommon: most WGAN and WGAN-GP implementations only work on 'mnist' or one given dataset.

Modifications

Here are a couple of modifications I've made that could be helpful to people implementing a GAN on their own for the first time.

  1. Added a model_type flag, which can be one of 'GAN' (original), 'WGAN' (Wasserstein distance as the loss), or 'WGAN_GP' (Wasserstein distance as the loss plus a gradient penalty), each corresponding to one GAN variant (see the loss sketch below).
  2. UnifiedDCGAN builds and trains the graph differently according to model_type.
  3. Some model methods were restructured so that the code is easier to read through.
  4. Many comments were added for important or potentially confusing functions, such as the conv and deconv operations in ops.py.
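
To make the difference between the three modes concrete, here is a minimal sketch (not the repo's actual code) of how the discriminator/critic loss could branch on model_type. The tensor and function names (D_real_logits, discriminator(...), d_vars, etc.) are illustrative assumptions; the constants follow the original WGAN/WGAN-GP papers (weight clipping to ±0.01, penalty weight λ = 10).

```python
import tensorflow as tf  # written against the TensorFlow 1.x API

def discriminator_loss(model_type, D_real_logits, D_fake_logits,
                       real_images, fake_images, discriminator, gp_lambda=10.0):
    """Illustrative discriminator/critic loss for each model_type (names assumed)."""
    if model_type == 'GAN':
        # Original GAN: sigmoid cross-entropy against real(1)/fake(0) labels.
        loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            logits=D_real_logits, labels=tf.ones_like(D_real_logits)))
        loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            logits=D_fake_logits, labels=tf.zeros_like(D_fake_logits)))
        return loss_real + loss_fake

    # WGAN / WGAN_GP: the critic minimizes E[D(fake)] - E[D(real)].
    loss = tf.reduce_mean(D_fake_logits) - tf.reduce_mean(D_real_logits)

    if model_type == 'WGAN_GP':
        # Gradient penalty on random interpolations between real and fake images.
        eps = tf.random_uniform([tf.shape(real_images)[0], 1, 1, 1], 0.0, 1.0)
        x_hat = eps * real_images + (1.0 - eps) * fake_images
        grads = tf.gradients(discriminator(x_hat, reuse=True), [x_hat])[0]
        grad_norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]))
        loss += gp_lambda * tf.reduce_mean(tf.square(grad_norm - 1.0))

    return loss

# Plain WGAN instead enforces the Lipschitz constraint by clipping the critic
# weights after each update, e.g.:
#   clip_ops = [w.assign(tf.clip_by_value(w, -0.01, 0.01)) for w in d_vars]
```

For WGAN and WGAN_GP the corresponding generator loss is simply -E[D(fake)], while the original GAN keeps the cross-entropy form.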

The download.py file stays the same as in carpedm20/DCGAN-tensorflow. I keep this file in the repo so that datasets for testing are easy to fetch.

Reading

If you are interested in the math behind the loss functions of GAN and WGAN, read here.
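
For quick reference, these are the standard objectives behind the three model_type options, where D is the discriminator/critic, G the generator, \hat{x} a random interpolation between a real and a generated sample, and \lambda the gradient-penalty weight:

GAN:      \min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]

WGAN:     \min_G \max_{D:\, \|D\|_L \le 1} \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] - \mathbb{E}_{z \sim p_z}[D(G(z))]

WGAN-GP:  L_D = \mathbb{E}_{z \sim p_z}[D(G(z))] - \mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] + \lambda\, \mathbb{E}_{\hat{x}}\big[(\|\nabla_{\hat{x}} D(\hat{x})\|_2 - 1)^2\big]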

Related Papers

Test Runs:

(left) python main.py --dataset=mnist --model_type=GAN --batch_size=64 --input_height=28 --output_height=28 --max_iter=10000 --learning_rate=0.0002 --train
(middle) python main.py --dataset=mnist --model_type=WGAN --batch_size=64 --input_height=28 --output_height=28 --d_iter=5 --max_iter=10000 --learning_rate=0.00005 --train
(right) python main.py --dataset=mnist --model_type=WGAN_GP --batch_size=64 --input_height=28 --output_height=28 --d_iter=5 --max_iter=10000 --learning_rate=0.0001 --train
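
The --d_iter flag used in the WGAN and WGAN_GP runs above controls how many discriminator (critic) updates are made per generator update, as in the WGAN papers. A rough sketch of that schedule, using hypothetical helper and op names rather than the repo's actual ones:

```python
def train_loop(sess, model_type, max_iter, d_iter,
               d_optim, g_optim, clip_ops, feed_real_and_z, feed_z):
    """Illustrative training schedule; feed_real_and_z() and feed_z() are
    assumed helpers that return feed_dicts for the graph's placeholders."""
    for step in range(max_iter):
        # WGAN/WGAN_GP train the critic d_iter times per generator step;
        # the original GAN alternates one-for-one.
        n_critic = d_iter if model_type in ('WGAN', 'WGAN_GP') else 1
        for _ in range(n_critic):
            sess.run(d_optim, feed_dict=feed_real_and_z())
            if model_type == 'WGAN':
                # Plain WGAN enforces the Lipschitz constraint by weight clipping.
                sess.run(clip_ops)
        sess.run(g_optim, feed_dict=feed_z())
```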

