
joeylitalien / celeba-gan-pytorch

License: other
Generative Adversarial Networks in PyTorch

Programming Languages

Python
139335 projects - #7 most used programming language
TeX
3793 projects
Shell
77523 projects

Projects that are alternatives of or similar to celeba-gan-pytorch

Triple Gan
See Triple-GAN-V2 in PyTorch: https://github.com/taufikxu/Triple-GAN
Stars: ✭ 203 (+480%)
Mutual labels:  generative-adversarial-network, generative-model
coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+691.43%)
Mutual labels:  generative-adversarial-network, generative-model
Wgan
Tensorflow Implementation of Wasserstein GAN (and Improved version in wgan_v2)
Stars: ✭ 228 (+551.43%)
Mutual labels:  generative-adversarial-network, generative-model
Stylegan2 Pytorch
Simplest working implementation of Stylegan2, state of the art generative adversarial network, in Pytorch. Enabling everyone to experience disentanglement
Stars: ✭ 2,656 (+7488.57%)
Mutual labels:  generative-adversarial-network, generative-model
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-65.71%)
Mutual labels:  generative-adversarial-network, generative-model
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (+440%)
Mutual labels:  generative-adversarial-network, generative-model
MMD-GAN
Improving MMD-GAN training with repulsive loss function
Stars: ✭ 82 (+134.29%)
Mutual labels:  generative-adversarial-network, generative-model
Cramer Gan
Tensorflow Implementation on "The Cramer Distance as a Solution to Biased Wasserstein Gradients" (https://arxiv.org/pdf/1705.10743.pdf)
Stars: ✭ 123 (+251.43%)
Mutual labels:  generative-adversarial-network, generative-model
GraphCNN-GAN
Graph-convolutional GAN for point cloud generation. Code from ICLR 2019 paper Learning Localized Generative Models for 3D Point Clouds via Graph Convolution
Stars: ✭ 50 (+42.86%)
Mutual labels:  generative-adversarial-network, generative-model
simplegan
Tensorflow-based framework to ease training of generative models
Stars: ✭ 19 (-45.71%)
Mutual labels:  generative-adversarial-network, generative-model
Conditional Gan
Anime Generation
Stars: ✭ 141 (+302.86%)
Mutual labels:  generative-adversarial-network, generative-model
py-msa-kdenlive
Python script to load a Kdenlive (OSS NLE video editor) project file, and conform the edit on video or numpy arrays.
Stars: ✭ 25 (-28.57%)
Mutual labels:  generative-adversarial-network, generative-model
Semantic image inpainting
Semantic Image Inpainting
Stars: ✭ 140 (+300%)
Mutual labels:  generative-adversarial-network, generative-model
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+468.57%)
Mutual labels:  generative-adversarial-network, generative-model
Gesturegan
[ACM MM 2018 Oral] GestureGAN for Hand Gesture-to-Gesture Translation in the Wild
Stars: ✭ 136 (+288.57%)
Mutual labels:  generative-adversarial-network, generative-model
Sgan
Stacked Generative Adversarial Networks
Stars: ✭ 240 (+585.71%)
Mutual labels:  generative-adversarial-network, generative-model
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (+185.71%)
Mutual labels:  generative-adversarial-network, generative-model
Generative Evaluation Prdc
Code base for the precision, recall, density, and coverage metrics for generative models. ICML 2020.
Stars: ✭ 117 (+234.29%)
Mutual labels:  generative-adversarial-network, generative-model
pytorch-GAN
My pytorch implementation for GAN
Stars: ✭ 12 (-65.71%)
Mutual labels:  generative-adversarial-network, generative-model
pytorch-CycleGAN
Pytorch implementation of CycleGAN.
Stars: ✭ 39 (+11.43%)
Mutual labels:  generative-adversarial-network, generative-model

CelebA GANs in PyTorch

IFT6135 Representation Learning (UdeM, A. Courville) — Assignment 4

Dependencies

Tested on Python 3.6.x.

CelebA dataset

The full CelebA dataset is available here. To resize the RGB images to 64 × 64 pixels, run CelebA_helper.py.
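
CelebA_helper.py is not reproduced here; a minimal sketch of the resizing step, assuming PIL and hypothetical source/destination folders, might look like this:

import os
from PIL import Image

src, dst = "./../data/celebA_raw", "./../data/celebA_all"   # hypothetical paths
os.makedirs(dst, exist_ok=True)
for name in os.listdir(src):
    if name.lower().endswith((".jpg", ".png")):
        img = Image.open(os.path.join(src, name)).convert("RGB")
        # Resize to the 64 x 64 resolution expected by the models
        img.resize((64, 64), Image.BILINEAR).save(os.path.join(dst, name))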

Training

To train a model, simply specify the model type (gan, wgan or lsgan) along with the appropriate hyperparameters. If these parameters are not specified, the program falls back to the default training parameters from the original papers.

./train.py --type wgan \
           --nb-epochs 50 \
           --batch-size 64 \
           --learning-rate 0.00005 \
           --optimizer rmsprop \
           --critic 5 \
           --ckpt ./../checkpoints/trained_wgan \
           --cuda

This assumes that the training images are in ./../data/celebA_all. To train using a smaller dataset (e.g. 12800 images), create a new folder called ./../data/celebA_redux and train using the --redux flag.
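
For context, the WGAN defaults above (RMSprop, learning rate 5e-5, 5 critic updates per generator update) follow Arjovsky et al. A minimal sketch of one critic update under those defaults, with model definitions and data loading omitted and all names hypothetical:

import torch

def critic_step(critic, gen, real, opt_c, clip=0.01, z_dim=100):
    # One WGAN critic update: minimize D(fake) - D(real), which maximizes
    # the critic's Wasserstein estimate D(real) - D(fake).
    opt_c.zero_grad()
    z = torch.randn(real.size(0), z_dim, device=real.device)  # reshape if the
    # generator expects a DCGAN-style (N, z_dim, 1, 1) input
    loss = critic(gen(z).detach()).mean() - critic(real).mean()
    loss.backward()
    opt_c.step()
    # Weight clipping from the WGAN paper keeps the critic roughly Lipschitz
    for p in critic.parameters():
        p.data.clamp_(-clip, clip)
    return loss.item()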

To create GIF/MP4 videos like the ones below, run src/checkpoints/make_anim.sh trained_* after training. This will annotate each epoch's frame using ImageMagick and combine the frames into a single video using FFmpeg.
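
make_anim.sh is not reproduced here; a rough Python equivalent (the frame file names, labels, and output paths are all assumptions) could drive the same two tools via subprocess:

import glob, subprocess

frames = sorted(glob.glob("trained_wgan/epoch_*.png"))   # hypothetical frame names
for i, f in enumerate(frames):
    # ImageMagick: burn the epoch number into the bottom of each frame, in place
    subprocess.run(["convert", f, "-gravity", "South", "-fill", "white",
                    "-annotate", "+0+5", f"epoch {i:02d}", f], check=True)
# FFmpeg: stitch the annotated frames into a single MP4
subprocess.run(["ffmpeg", "-y", "-framerate", "10", "-pattern_type", "glob",
                "-i", "trained_wgan/epoch_*.png", "-pix_fmt", "yuv420p",
                "anim.mp4"], check=True)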

[Training animations: GAN, WGAN, LSGAN]

Notice how the LSGAN suffers from total mode collapse at epoch 45.

Latent space exploration

To explore the face manifold in latent space, run

./lerp.py --pretrained ./checkpoints/trained_gan/dcgan-gen.pt \
          --dir ./out \
          --latent-play 140 \
          --cuda

This will use RNG seed 140 to generate a random tensor of size 100. Each dimension is then clamped to ±3 in turn and the result saved to a new image ./out/dim*.png. The result is 100 images, each differing from the original in a single dimension. These images can then be inspected to figure out which dimensions control different generative features (e.g., open/closed mouth, hair color, gender, nose shape).
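
The exact clamping scheme lives in lerp.py; the idea, in a hypothetical sketch (image saving omitted, and the clamping details may differ):

import torch

# Load the pretrained generator; a DCGAN generator may expect input of
# shape (1, 100, 1, 1), so reshape z if needed.
gen = torch.load("./checkpoints/trained_gan/dcgan-gen.pt")
torch.manual_seed(140)                    # RNG seed from --latent-play
z0 = torch.randn(1, 100)                  # base latent vector
for d in range(100):
    z = z0.clone()
    z[0, d] = 3.0                         # clamp one dimension to +3 (or -3)
    img = gen(z)                          # differs from gen(z0) in one dimension
    # save img as ./out/dim{d}.png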

[Latent space exploration grids: GAN, WGAN]

Latent space interpolation

To perform linear interpolation in latent space, run

./lerp.py --pretrained ./checkpoints/trained_gan/dcgan-gen.pt \
          --dir ./out \
          --latent 140 180 \
          --nb-frames 50 \
          --video \
          --cuda

This will linearly interpolate between two random tensors generated from seeds 140 and 180 and create a GIF/MP4 video of the sequence. The frames and videos are stored in ./out.
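
Under the hood this is ordinary linear interpolation between the two latent tensors; a hypothetical sketch (generator loading and video writing omitted, as above):

import torch

gen = torch.load("./checkpoints/trained_gan/dcgan-gen.pt")
torch.manual_seed(140); z0 = torch.randn(1, 100)
torch.manual_seed(180); z1 = torch.randn(1, 100)
frames = []
for t in torch.linspace(0, 1, 50):        # 50 frames, matching --nb-frames
    z = (1 - t) * z0 + t * z1             # lerp in latent space
    frames.append(gen(z))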

[Latent space interpolation videos: GAN, WGAN]

Screen space interpolation

To perform linear interpolation in screen space, run

./lerp.py --pretrained ./checkpoints/trained_gan/dcgan-gen.pt \
          --dir ./out \
          --screen 140 180 \
          --nb-frames 50 \
          --video \
          --cuda

This will linearly interpolate between two random images generated from seeds 140 and 180 and create a GIF/MP4 video of the sequence.
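
The difference from the previous section is where the blending happens: the two endpoint images are generated first and the interpolation runs over pixels, which produces a cross-fade rather than a walk along the face manifold. Reusing the names from the sketch above:

img0, img1 = gen(z0), gen(z1)             # generate both endpoint images first
frames = [(1 - t) * img0 + t * img1       # blend pixels, not latent codes
          for t in torch.linspace(0, 1, 50)]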

[Screen space interpolation videos: GAN, WGAN]

Inception and Mode score

We reuse the code from Shane Barratt to quantitatively measure our models' performance. Calculating the scores using 4096 samples gives the bar graph below.
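
For reference, the Inception Score being computed here is the standard one from Salimans et al. (2016); the formula below is from the literature, not from this repository:

$$\mathrm{IS} = \exp\!\Big(\mathbb{E}_{x \sim p_g}\big[\mathrm{KL}\big(p(y \mid x)\,\big\|\,p(y)\big)\big]\Big)$$

where $p(y \mid x)$ is the Inception network's label posterior for a generated sample $x$ and $p(y)$ is its marginal over generated samples; higher is better.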

Authors

References

Arjovsky et al. Wasserstein Generative Adversarial Networks. In Proceedings of the 34th International Conference on Machine Learning, ICML 2017.

Goodfellow et al. Generative Adversarial Nets. In Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014.

Mao et al. Multi-class Generative Adversarial Networks with the L2 Loss Function. arXiv, abs/1611.04076, 2016.

Radford et al. Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks. arXiv, abs/1511.06434, 2015.
