Natsu6767 / Infogan Pytorch

PyTorch Implementation of InfoGAN


Projects that are alternatives of or similar to Infogan Pytorch

gan-error-avoidance
Learning to Avoid Errors in GANs by Input Space Manipulation (Code for paper)
Stars: ✭ 23 (-88.32%)
Mutual labels:  celeba
Pytorch Mnist Celeba Gan Dcgan
Pytorch implementation of Generative Adversarial Networks (GAN) and Deep Convolutional Generative Adversarial Networks (DCGAN) for MNIST and CelebA datasets
Stars: ✭ 363 (+84.26%)
Mutual labels:  celeba
Interpretability By Parts
Code repository for "Interpretable and Accurate Fine-grained Recognition via Region Grouping", CVPR 2020 (Oral)
Stars: ✭ 88 (-55.33%)
Mutual labels:  celeba
Tensorflow DCGAN
Study Friendly Implementation of DCGAN in Tensorflow
Stars: ✭ 22 (-88.83%)
Mutual labels:  celeba
Beta Vae
Pytorch implementation of β-VAE
Stars: ✭ 326 (+65.48%)
Mutual labels:  celeba
Tf.gans Comparison
Implementations of (theoretical) generative adversarial networks and comparison without cherry-picking
Stars: ✭ 477 (+142.13%)
Mutual labels:  celeba
DCGAN-Pytorch
A Pytorch implementation of "Deep Convolutional Generative Adversarial Networks"
Stars: ✭ 23 (-88.32%)
Mutual labels:  celeba
Hd Celeba Cropper
To obtain high resolution face images from CelebA
Stars: ✭ 126 (-36.04%)
Mutual labels:  celeba
Pycadl
Python package with source code from the course "Creative Applications of Deep Learning w/ TensorFlow"
Stars: ✭ 356 (+80.71%)
Mutual labels:  celeba
Celeba Hq Modified
Modified h5tool.py to make obtaining CelebA-HQ easier
Stars: ✭ 84 (-57.36%)
Mutual labels:  celeba
DCGAN-CelebA-PyTorch-CPP
DCGAN Implementation using PyTorch in both C++ and Python
Stars: ✭ 14 (-92.89%)
Mutual labels:  celeba
Pytorch Mnist Celeba Cgan Cdcgan
Pytorch implementation of conditional Generative Adversarial Networks (cGAN) and conditional Deep Convolutional Generative Adversarial Networks (cDCGAN) for MNIST dataset
Stars: ✭ 290 (+47.21%)
Mutual labels:  celeba
Began Tensorflow
Tensorflow implementation of "BEGAN: Boundary Equilibrium Generative Adversarial Networks"
Stars: ✭ 904 (+358.88%)
Mutual labels:  celeba
style-vae
Implementation of VAE and Style-GAN Architecture Achieving State of the Art Reconstruction
Stars: ✭ 25 (-87.31%)
Mutual labels:  celeba
Tf Exercise Gan
Tensorflow implementation of different GANs and their comparisons
Stars: ✭ 110 (-44.16%)
Mutual labels:  celeba
cDCGAN
PyTorch implementation of Conditional Deep Convolutional Generative Adversarial Networks (cDCGAN)
Stars: ✭ 49 (-75.13%)
Mutual labels:  celeba
Disentangling Vae
Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (+102.03%)
Mutual labels:  celeba
Factorvae
Pytorch implementation of FactorVAE proposed in Disentangling by Factorising (http://arxiv.org/abs/1802.05983)
Stars: ✭ 176 (-10.66%)
Mutual labels:  celeba
Msg Gan V1
MSG-GAN: Multi-Scale Gradients GAN (Architecture inspired from ProGAN but doesn't use layer-wise growing)
Stars: ✭ 116 (-41.12%)
Mutual labels:  celeba
Celebamask Hq
A large-scale face dataset for face parsing, recognition, generation and editing.
Stars: ✭ 1,156 (+486.8%)
Mutual labels:  celeba

InfoGAN-PyTorch

PyTorch implementation of InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets, with results of experiments on the MNIST, FashionMNIST, SVHN, and CelebA datasets.

Introduction

InfoGAN is an information-theoretic extension to the standard Generative Adversarial Network that is able to learn disentangled representations in a completely unsupervised manner. In practice, this means that InfoGAN successfully disentangles writing styles from digit shapes on the MNIST dataset and discovers visual concepts such as hair style and gender on the CelebA dataset. To achieve this, an information-theoretic regularization term is added to the loss function that enforces the maximization of mutual information between the latent codes, c, and the generator distribution G(z, c).
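Concretely, the paper augments the standard minimax objective V(D, G) with a mutual-information term, and, since that term is intractable, maximizes a variational lower bound L_I defined through an auxiliary distribution Q(c|x) that approximates the posterior P(c|x):

```latex
% InfoGAN's information-regularized minimax game:
\min_G \max_D \; V_I(D, G) = V(D, G) - \lambda \, I(c;\, G(z, c))

% Variational lower bound on the mutual information, using Q(c|x):
L_I(G, Q) = \mathbb{E}_{c \sim P(c),\; x \sim G(z, c)}\big[\log Q(c \mid x)\big] + H(c)
```

In practice Q shares most of its layers with the discriminator, so the regularization adds very little computational cost.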

Folder structure

The following shows the basic folder structure.

├── train.py # train script
├── data
│   ├── mnist # mnist data (not included in this repo)
│   ├── ...
│   ├── ...
│   └── fashion-mnist # fashion-mnist data (not included in this repo)
│
├── config.py # hyperparameters for training
├── utils.py # utils
├── dataloader.py # dataloader
├── models # infoGAN networks for different datasets
│   ├── mnist_model.py
│   ├── svhn_model.py
│   └── celeba_model.py
└── results # generation results to be saved here

Development Environment

  • Ubuntu 16.04 LTS
  • NVIDIA GeForce GTX 1060
  • cuda 9.0
  • Python 3.6.5
  • PyTorch 1.0.0
  • torchvision 0.2.1
  • numpy 1.14.3
  • matplotlib 2.2.2

Usage

Edit the config.py file to select the training parameters and the dataset to use. Choose the dataset from ['MNIST', 'FashionMNIST', 'SVHN', 'CelebA'].
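As a rough sketch, the relevant settings look something like the following. The variable names and default values here are illustrative assumptions (the continuous/categorical code dimensions follow the paper's MNIST setup); check the actual config.py for the exact names.

```python
# Hypothetical sketch of config.py -- names and values are illustrative,
# not the repository's exact contents.
dataset = 'MNIST'     # one of 'MNIST', 'FashionMNIST', 'SVHN', 'CelebA'
batch_size = 128      # training batch size
num_epochs = 100      # the result GIFs below span epochs 1-100
num_z = 62            # dimension of incompressible noise z (paper's MNIST setting)
num_dis_c = 1         # number of discrete (categorical) latent codes
dis_c_dim = 10        # dimension of each categorical code (10 digit classes)
num_con_c = 2         # number of continuous latent codes
```

The total generator input dimension is then num_z + num_dis_c * dis_c_dim + num_con_c.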

To train the model run train.py:

python3 train.py

After training the network, run mnist_generate.py to experiment with the latent code for the MNIST dataset:

python3 mnist_generate.py --load_path /path/to/pth/checkpoint

Results

MNIST

Training Data Generation GIF
Epoch 1 Epoch 50 Epoch 100

Training Loss Curve:

Manipulating Latent Code

Rotation of digits.
Rows represent the categorical variable from K = 0 to K = 9 (top to bottom), characterizing the digits. Columns represent the continuous variable, varying from -2 to 2 (left to right).

Variation in Width
Rows represent the categorical variable from K = 0 to K = 9 (top to bottom), characterizing the digits. Columns represent the continuous variable, varying from -2 to 2 (left to right).
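A minimal sketch of how such a grid of latent inputs could be constructed (variable names and dimensions are assumptions following the paper's MNIST setup, not the repository's exact code): each row fixes one of the 10 categorical values, each column sweeps one continuous code from -2 to 2, and the noise z is held fixed so only the codes change.

```python
import numpy as np

rows, cols = 10, 10                      # 10 categories x 10 continuous steps
z_dim, cat_dim, con_dim = 62, 10, 2      # assumed dimensions (paper's MNIST setup)

# Same incompressible noise z for every cell of the grid.
z = np.random.randn(1, z_dim).repeat(rows * cols, axis=0)

# One-hot categorical code: every image in row k uses category k.
cat = np.zeros((rows * cols, cat_dim))
cat[np.arange(rows * cols), np.repeat(np.arange(rows), cols)] = 1.0

# Continuous codes: the first code sweeps -2..2 across columns, the second stays 0.
con = np.zeros((rows * cols, con_dim))
con[:, 0] = np.tile(np.linspace(-2, 2, cols), rows)

# Concatenated latent input of shape (100, 74), to be fed to the generator.
latent = np.concatenate([z, cat, con], axis=1)
```

Varying con[:, 1] instead of con[:, 0] would produce the width-variation grid.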

FashionMNIST

Training Data Generation GIF
Epoch 1 Epoch 50 Epoch 100

Training Loss Curve:

Manipulating Latent Code

Thickness of items.
Rows represent the categorical variable from K = 0 to K = 9 (top to bottom), characterizing the items. Columns represent the continuous variable, varying from -2 to 2 (left to right).

SVHN

Training Data Generation GIF
Epoch 1 Epoch 50 Epoch 100

Training Loss Curve:

Manipulating Latent Code

Continuous Variation: Lighting

Discrete Variation: Plate Context

CelebA

Training Data Generation GIF
Epoch 1 Epoch 50 Epoch 100

Training Loss Curve:

Manipulating Latent Code

Azimuth (pose)

Gender: Roughly ordered from male to female (left to right)

Emotion

Hair Style and Color

Hair Quantity: Roughly ordered from less hair to more hair (left to right)

References

  1. Xi Chen, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, Pieter Abbeel. InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets. [arxiv]
  2. pianomania/infoGAN-pytorch [repo]