
ermongroup / Markov Chain GAN

License: MIT
Code for "Generative Adversarial Training for Markov Chains" (ICLR 2017 Workshop)


Projects that are alternatives to or similar to Markov Chain GAN

GraphCNN-GAN
Graph-convolutional GAN for point cloud generation. Code from ICLR 2019 paper Learning Localized Generative Models for 3D Point Clouds via Graph Convolution
Stars: ✭ 50 (-34.21%)
Mutual labels:  generative-adversarial-network, generative-model
TriangleGAN
TriangleGAN, ACM MM 2019.
Stars: ✭ 28 (-63.16%)
Mutual labels:  generative-adversarial-network, generative-model
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-84.21%)
Mutual labels:  generative-adversarial-network, generative-model
coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+264.47%)
Mutual labels:  generative-adversarial-network, generative-model
Conditional Animegan
Conditional GAN for Anime face generation.
Stars: ✭ 70 (-7.89%)
Mutual labels:  generative-adversarial-network, generative-model
pytorch-GAN
My PyTorch implementation of GANs
Stars: ✭ 12 (-84.21%)
Mutual labels:  generative-adversarial-network, generative-model
py-msa-kdenlive
Python script to load a Kdenlive (OSS NLE video editor) project file, and conform the edit on video or numpy arrays.
Stars: ✭ 25 (-67.11%)
Mutual labels:  generative-adversarial-network, generative-model
Triple Gan
See Triple-GAN-V2 in PyTorch: https://github.com/taufikxu/Triple-GAN
Stars: ✭ 203 (+167.11%)
Mutual labels:  generative-adversarial-network, generative-model
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+4081.58%)
Mutual labels:  generative-adversarial-network, generative-model
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto-Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+263.16%)
Mutual labels:  generative-adversarial-network, generative-model
MMD-GAN
Improving MMD-GAN training with repulsive loss function
Stars: ✭ 82 (+7.89%)
Mutual labels:  generative-adversarial-network, generative-model
Cadgan
ICML 2019. Turn a pre-trained GAN model into a content-addressable model without retraining.
Stars: ✭ 19 (-75%)
Mutual labels:  generative-adversarial-network, generative-model
Sgan
Stacked Generative Adversarial Networks
Stars: ✭ 240 (+215.79%)
Mutual labels:  generative-adversarial-network, generative-model
simplegan
Tensorflow-based framework to ease training of generative models
Stars: ✭ 19 (-75%)
Mutual labels:  generative-adversarial-network, generative-model
Wgan
Tensorflow Implementation of Wasserstein GAN (and Improved version in wgan_v2)
Stars: ✭ 228 (+200%)
Mutual labels:  generative-adversarial-network, generative-model
pytorch-CycleGAN
Pytorch implementation of CycleGAN.
Stars: ✭ 39 (-48.68%)
Mutual labels:  generative-adversarial-network, generative-model
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (+148.68%)
Mutual labels:  generative-adversarial-network, generative-model
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+161.84%)
Mutual labels:  generative-adversarial-network, generative-model
celeba-gan-pytorch
Generative Adversarial Networks in PyTorch
Stars: ✭ 35 (-53.95%)
Mutual labels:  generative-adversarial-network, generative-model
Seqgan
A simplified PyTorch implementation of "SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient." (Yu, Lantao, et al.)
Stars: ✭ 502 (+560.53%)
Mutual labels:  generative-adversarial-network, generative-model

Markov Chain GAN (MGAN)

TensorFlow code for Generative Adversarial Training for Markov Chains (ICLR 2017 Workshop Track).

Work by Jiaming Song, Shengjia Zhao and Stefano Ermon.


Preprocessing

Running the code requires some preprocessing: we convert the data into TFRecords files to maximize input speed (as recommended by TensorFlow).
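
If you need to build such records yourself, the sketch below shows one way to serialize image data into a TFRecords file with the TF 1.x API. It is a hypothetical stand-in rather than the preprocessing script used for the datasets linked below, so the feature names and shapes this repository expects may differ.

    # Hypothetical sketch: serialize an (N, 784) float array of images into a
    # TFRecords file under ~/data/mnist_tfrecords. The feature name 'image' is
    # an assumption, not necessarily what this repository's readers expect.
    import os
    import numpy as np
    import tensorflow as tf

    def write_tfrecords(images, path):
        """Write each image as one tf.train.Example with a float 'image' feature."""
        with tf.python_io.TFRecordWriter(path) as writer:
            for img in images:
                feature = {'image': tf.train.Feature(
                    float_list=tf.train.FloatList(value=img.reshape(-1).tolist()))}
                example = tf.train.Example(features=tf.train.Features(feature=feature))
                writer.write(example.SerializeToString())

    if __name__ == '__main__':
        out_dir = os.path.expanduser('~/data/mnist_tfrecords')
        os.makedirs(out_dir, exist_ok=True)
        dummy = np.random.rand(100, 784).astype(np.float32)  # stand-in for real data
        write_tfrecords(dummy, os.path.join(out_dir, 'train.tfrecords'))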

MNIST

The data used for training is here. Download and place the directory in ~/data/mnist_tfrecords.

(This can easily be done with a symlink, or you can change the path in models/mnist/__init__.py.)

CelebA

The data used for training is here. Download and place the directory in ~/data/celeba_tfrecords.


Running Experiments

python mgan.py [data] [model] -b [B] -m [M] -d [critic iterations] --gpus [gpus]

where B is the number of chain steps from noise to data, M is the number of chain steps from data to data, -d sets the number of critic iterations, and [gpus] sets the CUDA_VISIBLE_DEVICES environment variable.

MNIST

python mgan.py mnist mlp -b 4 -m 3 -d 7 --gpus [gpus]

CelebA

Without shortcut connections:

python mgan.py celeba conv -b 4 -m 3 -d 7 --gpus [gpus]

With shortcut connections (you will observe a much slower transition):

python mgan.py celeba conv_res -b 4 -m 3 -d 7 --gpus [gpus]

Custom Experiments

It is easy to define your own problem and run experiments.

  • Create a folder data under the models directory, and define data_sampler and noise_sampler in __init__.py.
  • Create a file model.py under the models/data directory, and define the following (a hypothetical skeleton is sketched after this list):
    • class TransitionFunction(TransitionBase) (Generator)
    • class Discriminator(DiscriminatorBase) (Discriminator)
    • def visualizer(model, name) (If you need to generate figures)
    • epoch_size and logging_freq
  • That's it!
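
For orientation, here is a hypothetical, self-contained skeleton of such a model.py. The real TransitionBase and DiscriminatorBase classes ship with this repository (see models/mnist/ for the authoritative interface); they are stubbed out below only so the sketch stands on its own, and every signature and value is a placeholder rather than the actual API.

    # Hypothetical skeleton of models/<data>/model.py. The accompanying
    # models/<data>/__init__.py defines data_sampler and noise_sampler.
    import tensorflow as tf


    class TransitionBase(object):       # stub -- use the repository's base class
        pass


    class DiscriminatorBase(object):    # stub -- use the repository's base class
        pass


    epoch_size = 1000      # iterations per epoch (placeholder value)
    logging_freq = 100     # log every `logging_freq` iterations (placeholder value)


    class TransitionFunction(TransitionBase):
        """Generator: one step of the chain, mapping (x_t, z_t) to x_{t+1}."""
        def __call__(self, x, z):
            # Placeholder architecture: a single fully connected layer.
            with tf.variable_scope('transition', reuse=tf.AUTO_REUSE):
                return tf.layers.dense(tf.concat([x, z], axis=1), 784,
                                       activation=tf.nn.sigmoid)


    class Discriminator(DiscriminatorBase):
        """Critic that scores how data-like a sample is."""
        def __call__(self, x):
            with tf.variable_scope('discriminator', reuse=tf.AUTO_REUSE):
                h = tf.layers.dense(x, 256, activation=tf.nn.relu)
                return tf.layers.dense(h, 1)


    def visualizer(model, name):
        """Optional hook for writing sample figures; omit it if you do not need figures."""
        pass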

Figures

Each row is from a single chain, where we sample for 50 time steps.
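
Conceptually, each row is produced by starting from an initial sample and repeatedly applying the learned transition operator. The sketch below illustrates that rollout with a stand-in transition function; it is not the repository's sampling code.

    # Hypothetical rollout: one figure row = the chain states obtained by
    # repeatedly applying the (trained) transition operator. `transition` here
    # is a stand-in for the trained generator.
    import numpy as np

    def transition(x, z):
        # Placeholder one-step kernel; the real operator is the trained generator.
        return np.clip(0.9 * x + 0.1 * z, 0.0, 1.0)

    def sample_chain(x0, num_steps=50, seed=0):
        """Roll out num_steps states of the chain starting from x0."""
        rng = np.random.RandomState(seed)
        states = [x0]
        for _ in range(num_steps):
            z = rng.rand(*x0.shape).astype(np.float32)
            states.append(transition(states[-1], z))
        return np.stack(states)  # shape (num_steps + 1, ...): one row of the figure

    row = sample_chain(np.random.rand(784).astype(np.float32))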

MNIST

MNIST MLP

CelebA

Without shortcut connections: CelebA 1-layer conv

With shortcut connections: CelebA 1-layer conv with shortcuts

Related Projects

a-nice-mc: adversarial training for efficient MCMC kernels, which is based on this project.

Citation

If you use this code for your research, please cite our paper:

@article{song2017generative,
  title={Generative Adversarial Training for Markov Chains},
  author={Song, Jiaming and Zhao, Shengjia and Ermon, Stefano},
  journal={ICLR 2017 (Workshop Track)},
  year={2017}
}

Contact

[email protected]

Code for the Pairwise Discriminator is not available at the moment; I will add it when I have time.
