
val-iisc / Deligan

Licence: MIT
This project is an implementation of the Generative Adversarial Network proposed in our CVPR 2017 paper - DeLiGAN : Generative Adversarial Networks for Diverse and Limited Data. DeLiGAN is a simple but effective modification of the GAN framework and aims to improve performance on datasets which are diverse yet small in size.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Deligan

Bicyclegan
Toward Multimodal Image-to-Image Translation
Stars: ✭ 1,215 (+1079.61%)
Mutual labels:  generative-adversarial-network
Specgan
SpecGAN - generate audio with adversarial training
Stars: ✭ 92 (-10.68%)
Mutual labels:  generative-adversarial-network
3d Recgan Extended
🔥3D-RecGAN++ in Tensorflow (TPAMI 2018)
Stars: ✭ 98 (-4.85%)
Mutual labels:  generative-adversarial-network
Voice Conversion Gan
Voice Conversion using Cycle GAN's For Non-Parallel Data
Stars: ✭ 82 (-20.39%)
Mutual labels:  generative-adversarial-network
Deep Learning For Beginners
videos, lectures, blogs for Deep Learning
Stars: ✭ 89 (-13.59%)
Mutual labels:  generative-adversarial-network
Porousmediagan
Reconstruction of three-dimensional porous media using generative adversarial neural networks
Stars: ✭ 94 (-8.74%)
Mutual labels:  generative-adversarial-network
Markov Chain Gan
Code for "Generative Adversarial Training for Markov Chains" (ICLR 2017 Workshop)
Stars: ✭ 76 (-26.21%)
Mutual labels:  generative-adversarial-network
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (-2.91%)
Mutual labels:  generative-adversarial-network
Dmt
Disentangled Makeup Transfer with Generative Adversarial Network
Stars: ✭ 90 (-12.62%)
Mutual labels:  generative-adversarial-network
Lggan
[CVPR 2020] Local Class-Specific and Global Image-Level Generative Adversarial Networks for Semantic-Guided Scene Generation
Stars: ✭ 97 (-5.83%)
Mutual labels:  generative-adversarial-network
Tac Gan
A Tensorflow implementation of the Text Conditioned Auxiliary Classifier Generative Adversarial Network for Generating Images from text descriptions (https://arxiv.org/abs/1703.06412)
Stars: ✭ 82 (-20.39%)
Mutual labels:  generative-adversarial-network
Calogan
Generative Adversarial Networks for High Energy Physics extended to a multi-layer calorimeter simulation
Stars: ✭ 87 (-15.53%)
Mutual labels:  generative-adversarial-network
Doppelganger
[IMC 2020 (Best Paper Finalist)] Using GANs for Sharing Networked Time Series Data: Challenges, Initial Promise, and Open Questions
Stars: ✭ 97 (-5.83%)
Mutual labels:  generative-adversarial-network
Alice
NIPS 2017: ALICE: Towards Understanding Adversarial Learning for Joint Distribution Matching
Stars: ✭ 80 (-22.33%)
Mutual labels:  generative-adversarial-network
Chemgan Challenge
Code for the paper: Benhenda, M. 2017. ChemGAN challenge for drug discovery: can AI reproduce natural chemical diversity? arXiv preprint arXiv:1708.08227.
Stars: ✭ 98 (-4.85%)
Mutual labels:  generative-adversarial-network
Gazeanimation
Give a portrait face, move the gaze up
Stars: ✭ 77 (-25.24%)
Mutual labels:  generative-adversarial-network
Sprint gan
Privacy-preserving generative deep neural networks support clinical data sharing
Stars: ✭ 92 (-10.68%)
Mutual labels:  generative-adversarial-network
Gaal Based Outlier Detection
GAAL-based Outlier Detection
Stars: ✭ 102 (-0.97%)
Mutual labels:  generative-adversarial-network
Lsd Seg
Learning from Synthetic Data: Addressing Domain Shift for Semantic Segmentation
Stars: ✭ 99 (-3.88%)
Mutual labels:  generative-adversarial-network
Tagan
An official PyTorch implementation of the paper "Text-Adaptive Generative Adversarial Networks: Manipulating Images with Natural Language", NeurIPS 2018
Stars: ✭ 97 (-5.83%)
Mutual labels:  generative-adversarial-network

DeLiGAN


This project is an implementation of the Generative Adversarial Network proposed in our CVPR 2017 paper - DeLiGAN : Generative Adversarial Networks for Diverse and Limited Data. Via this project, we make two contributions:

  1. We propose a simple but effective modification of the GAN framework for settings where training data is diverse yet small in size (a minimal sketch of the idea follows this list).
  2. We propose a modification of the inception score introduced by Salimans et al. Our modified inception score provides a single, unified measure of inter-class and intra-class variety in the samples generated by a GAN.
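
The core modification is a reparameterization of the latent space as a mixture of Gaussians whose means and standard deviations are learned along with the generator: a mixture component is picked uniformly at random and z = mu_i + sigma_i * eps (with eps drawn from a standard normal) is fed to the generator. The NumPy sketch below illustrates only this sampling step; the component count, latent dimensionality, and initialization values are placeholders and do not mirror the settings or variable names used in this repository.

import numpy as np

# Illustrative sketch of the DeLiGAN latent-space reparameterization.
# In the actual models, mu and sigma are trainable parameters updated
# together with the generator (the paper additionally regularizes the
# sigmas to keep components from collapsing); here they are plain arrays.
n_components = 50    # number of mixture components (placeholder value)
latent_dim = 100     # latent vector dimensionality (placeholder value)

mu = np.random.uniform(-1.0, 1.0, size=(n_components, latent_dim))
sigma = 0.2 * np.ones((n_components, latent_dim))

def sample_latent(batch_size):
    """Draw a batch of latent vectors from the Gaussian mixture."""
    idx = np.random.randint(0, n_components, size=batch_size)  # uniform component choice
    eps = np.random.randn(batch_size, latent_dim)              # standard normal noise
    return mu[idx] + sigma[idx] * eps                          # reparameterized sample

z = sample_latent(64)  # e.g. a batch of latent vectors for the generator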

Dependencies

The code for DeLiGAN is provided in TensorFlow 0.10 for the MNIST and Toy datasets, and in Theano 0.8.2 + Lasagne 0.2 for the CIFAR-10 and Sketches datasets. The code was tested on an Ubuntu 14.04 workstation with an NVIDIA Titan X GPU.
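
If you want to confirm that the expected framework versions are installed before running anything, a quick optional check is:

# Optional: print installed framework versions to compare against the
# versions listed above (TensorFlow 0.10, Theano 0.8.2, Lasagne 0.2).
import tensorflow
import theano
import lasagne

print("tensorflow", tensorflow.__version__)
print("theano", theano.__version__)
print("lasagne", lasagne.__version__)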

Datasets

This repository includes implementations for four different datasets.

  1. Toy (self-generated unimodal and bimodal Gaussians; see the sketch below)
  2. MNIST (http://www.cs.toronto.edu/~gdahl/mnist.npz.gz)
  3. CIFAR-10 (https://www.cs.toronto.edu/~kriz/cifar.html)
  4. Sketches (http://cybertron.cg.tu-berlin.de/eitz/projects/classifysketch/)

The models for evaluating DeLiGAN on these datasets can be found in our repo. Details on how to download and lay out the datasets can be found in src/datasets/README.md.
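
Since the toy data is self-generated, the sketch below shows roughly how unimodal and bimodal Gaussian toy samples could be produced with NumPy. The dimensionality, means, standard deviations, and sample counts are placeholders and may differ from what the repository's toy experiments actually use.

import numpy as np

# Placeholder generation of toy data: a unimodal Gaussian and a
# bimodal two-component Gaussian mixture (1-D here for illustration).
rng = np.random.RandomState(0)

def unimodal(n, mean=0.0, std=1.0):
    return rng.normal(mean, std, size=n)

def bimodal(n, means=(-3.0, 3.0), stds=(0.5, 0.5)):
    comp = rng.randint(0, 2, size=n)           # pick a component uniformly
    return rng.normal(np.take(means, comp),    # component means
                      np.take(stds, comp))     # component standard deviations

data_uni = unimodal(1000)
data_bi = bimodal(1000)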

Usage

Training DeLiGAN models

To run any of the models:

  • First, download the datasets and store them in the respective sub-folders of the datasets folder (src/datasets/).
  • To run the model on a particular dataset, go to the corresponding folder under src/ and run its dg_<dataset>.py file with two arguments, --data_dir and --results_dir. For example, starting from the top-level folder:
cd src/sketches 
python dg_sketches.py --data_dir ../datasets/sketches/ --results_dir ../results/sketches
  • Note that the results_dir needs to have 'train' as a sub-folder (see the helper sketched below).
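
A small hypothetical helper (not part of the repository) for creating the expected results layout before launching training:

import os

# Create <results_dir>/train before launching dg_<dataset>.py.
# The path matches the sketches example above; adjust per dataset.
results_dir = "../results/sketches"
train_dir = os.path.join(results_dir, "train")
if not os.path.isdir(train_dir):
    os.makedirs(train_dir)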

Modified inception score

For example, to obtain the modified inception scores on CIFAR-10:

  • Download the Inception-v3 model (http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz) and store it in src/modified_inception_scores/cifar10/.
  • Generate samples using the model trained with dg_cifar.py and copy them to src/modified_inception_scores/cifar10/.
  • Run transfer_cifar10_softmax_b1.py to transfer-learn the last layer.
  • Perform the modifications detailed in the comments in transfer_cifar10_softmax_b1.py and re-run it to evaluate the inception scores.
  • The provided code can be modified slightly to work for Sketches as well, by following the comments provided in transfer_cifar10_softmax_b1.py.
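
For reference, the original inception score of Salimans et al. is the exponential of the mean KL divergence between the per-sample class posterior p(y|x) and the marginal p(y); the modified score computed by this repository extends it to also reflect intra-class diversity (see the paper and the comments in transfer_cifar10_softmax_b1.py for the exact definition). A minimal NumPy sketch of the original quantity, given a matrix of softmax outputs, is shown below; it is not the code used by this repository.

import numpy as np

def inception_score(preds, eps=1e-12):
    # preds: (N, C) array of softmax class posteriors p(y|x) for N samples.
    # Original score of Salimans et al.: exp(mean_x KL(p(y|x) || p(y))).
    # The modified score used here differs; see transfer_cifar10_softmax_b1.py.
    p_y = preds.mean(axis=0, keepdims=True)                                  # marginal p(y)
    kl = np.sum(preds * (np.log(preds + eps) - np.log(p_y + eps)), axis=1)   # per-sample KL
    return float(np.exp(kl.mean()))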

Parts of the code in this implementation have been borrowed from the Improved-GAN implementation by OpenAI (T. Salimans, I. Goodfellow, W. Zaremba, V. Cheung, A. Radford, and X. Chen. Improved techniques for training GANs. In Advances in Neural Information Processing Systems, pages 2226–2234, 2016).

Cite

@inproceedings{DeLiGAN17,
  author = {Gurumurthy, Swaminathan and Sarvadevabhatla, Ravi Kiran and Babu, R. Venkatesh},
  title = {DeLiGAN : Generative Adversarial Networks for Diverse and Limited Data},
  booktitle = {Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2017},
  location = {Honolulu, Hawaii, USA}
}

Q&A

Please send a message to [email protected] if you have any queries regarding the code.
