
adler-j / minimal_wgan

License: MIT License
A minimal implementation of Wasserstein GAN

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to, or similar to, minimal_wgan

Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+8502.27%)
Mutual labels:  mnist, wgan
Pytorch Generative Model Collections
Collection of generative models in PyTorch.
Stars: ✭ 2,296 (+5118.18%)
Mutual labels:  mnist, wgan
Tf Exercise Gan
Tensorflow implementation of different GANs and their comparisons
Stars: ✭ 110 (+150%)
Mutual labels:  mnist, wgan
Gan Tutorial
Simple Implementation of many GAN models with PyTorch.
Stars: ✭ 227 (+415.91%)
Mutual labels:  mnist, wgan
Generative adversarial networks 101
Keras implementations of Generative Adversarial Networks. GANs, DCGAN, CGAN, CCGAN, WGAN and LSGAN models with MNIST and CIFAR-10 datasets.
Stars: ✭ 138 (+213.64%)
Mutual labels:  mnist, wgan
Fun-with-MNIST
Playing with MNIST. Machine Learning. Generative Models.
Stars: ✭ 23 (-47.73%)
Mutual labels:  mnist, wgan
AdaBound-tensorflow
An optimizer that trains as fast as Adam and as well as SGD, in Tensorflow
Stars: ✭ 44 (+0%)
Mutual labels:  mnist
MNIST-multitask
6️⃣6️⃣6️⃣ Reproduce ICLR '18 under-reviewed paper "MULTI-TASK LEARNING ON MNIST IMAGE DATASETS"
Stars: ✭ 34 (-22.73%)
Mutual labels:  mnist
GAN-Anime-Characters
Applied several Generative Adversarial Network (GAN) techniques such as DCGAN, WGAN and StyleGAN to generate Anime Faces and Handwritten Digits.
Stars: ✭ 43 (-2.27%)
Mutual labels:  wgan
deeplearning-mpo
Replace the fully-connected layers of FC2, LeNet-5, VGG, Resnet and Densenet with MPO
Stars: ✭ 26 (-40.91%)
Mutual labels:  mnist
mnist test
mnist with Tensorflow
Stars: ✭ 30 (-31.82%)
Mutual labels:  mnist
CNN Own Dataset
CNN example for training on your own datasets.
Stars: ✭ 25 (-43.18%)
Mutual labels:  mnist
mnist-challenge
My solution to TUM's Machine Learning MNIST challenge 2016-2017 [winner]
Stars: ✭ 68 (+54.55%)
Mutual labels:  mnist
Open-Set-Recognition
Open Set Recognition
Stars: ✭ 49 (+11.36%)
Mutual labels:  mnist
VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU).
Stars: ✭ 66 (+50%)
Mutual labels:  mnist
tensorflow-mnist-convnets
Neural nets for MNIST classification, simple single layer NN, 5 layer FC NN and convolutional neural networks with different architectures
Stars: ✭ 22 (-50%)
Mutual labels:  mnist
GANs-Keras
GANs Implementations in Keras
Stars: ✭ 24 (-45.45%)
Mutual labels:  wgan
mnist-flask
A Flask web app for handwritten digit recognition using machine learning
Stars: ✭ 34 (-22.73%)
Mutual labels:  mnist
rust-simple-nn
Simple neural network implementation in Rust
Stars: ✭ 24 (-45.45%)
Mutual labels:  mnist
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-59.09%)
Mutual labels:  mnist

Minimal Wasserstein GAN

This is a simple TensorFlow implementation of Wasserstein Generative Adversarial Networks applied to MNIST.

Some example generated digits:

WGAN results

How to run

Simply run the file wgan_mnist.py. Results are displayed in real time; a full training run takes about an hour on a GPU.
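
For example, from the repository root (assuming TensorFlow and the script's other dependencies are installed):

```
python wgan_mnist.py
```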

Implementation details

The implementation follows Improved Training of Wasserstein GANs, using the network from the accompanying code. In particular, both the generator and the discriminator use three convolutional layers with 5x5 kernels.
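
As a rough illustration of the gradient-penalty objective from that paper, here is a minimal tf.keras sketch of a three-convolution critic and the WGAN-GP critic loss. This is not the code from wgan_mnist.py (which targets an earlier TensorFlow API); the filter counts and the penalty weight of 10 are assumptions taken from the paper rather than from this repository.

```python
import tensorflow as tf

def make_critic():
    # Three 5x5 convolutional layers followed by a single linear output,
    # mirroring the architecture described above. Filter counts are assumptions.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(64, 5, strides=2, padding='same'),
        tf.keras.layers.LeakyReLU(0.2),
        tf.keras.layers.Conv2D(128, 5, strides=2, padding='same'),
        tf.keras.layers.LeakyReLU(0.2),
        tf.keras.layers.Conv2D(256, 5, strides=2, padding='same'),
        tf.keras.layers.LeakyReLU(0.2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1),  # unbounded Wasserstein score, no sigmoid
    ])

def critic_loss(critic, real, fake, gp_weight=10.0):
    # Wasserstein critic loss: score fakes low and reals high.
    loss = tf.reduce_mean(critic(fake)) - tf.reduce_mean(critic(real))
    # Gradient penalty from "Improved Training of Wasserstein GANs":
    # penalize deviations of the critic's gradient norm from 1 on
    # random interpolates between real and generated samples.
    eps = tf.random.uniform([tf.shape(real)[0], 1, 1, 1], 0.0, 1.0)
    interp = eps * real + (1.0 - eps) * fake
    with tf.GradientTape() as tape:
        tape.watch(interp)
        scores = critic(interp)
    grads = tape.gradient(scores, interp)
    norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
    return loss + gp_weight * tf.reduce_mean(tf.square(norm - 1.0))
```

The generator loss is then simply the negated mean critic score of generated samples, and the critic is updated several times per generator step as in the paper.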
