
ChunyuanLI / Mnist_inception_score

Train a MNIST classifier and use it to compute the inception score (ICP)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to, or similar to, Mnist_inception_score

pytorch-GAN
My pytorch implementation for GAN
Stars: ✭ 12 (-52%)
Mutual labels:  generative-adversarial-network, generative-model
pytorch-CycleGAN
Pytorch implementation of CycleGAN.
Stars: ✭ 39 (+56%)
Mutual labels:  generative-adversarial-network, generative-model
simplegan
Tensorflow-based framework to ease training of generative models
Stars: ✭ 19 (-24%)
Mutual labels:  generative-adversarial-network, generative-model
Sgan
Stacked Generative Adversarial Networks
Stars: ✭ 240 (+860%)
Mutual labels:  generative-adversarial-network, generative-model
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+1004%)
Mutual labels:  generative-adversarial-network, generative-model
MMD-GAN
Improving MMD-GAN training with repulsive loss function
Stars: ✭ 82 (+228%)
Mutual labels:  generative-adversarial-network, generative-model
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-52%)
Mutual labels:  generative-adversarial-network, generative-model
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (+656%)
Mutual labels:  generative-adversarial-network, generative-model
celeba-gan-pytorch
Generative Adversarial Networks in PyTorch
Stars: ✭ 35 (+40%)
Mutual labels:  generative-adversarial-network, generative-model
TriangleGAN
TriangleGAN, ACM MM 2019.
Stars: ✭ 28 (+12%)
Mutual labels:  generative-adversarial-network, generative-model
Wgan
Tensorflow Implementation of Wasserstein GAN (and Improved version in wgan_v2)
Stars: ✭ 228 (+812%)
Mutual labels:  generative-adversarial-network, generative-model
Seqgan
A simplified PyTorch implementation of "SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient." (Yu, Lantao, et al.)
Stars: ✭ 502 (+1908%)
Mutual labels:  generative-adversarial-network, generative-model
Triple Gan
See Triple-GAN-V2 in PyTorch: https://github.com/taufikxu/Triple-GAN
Stars: ✭ 203 (+712%)
Mutual labels:  generative-adversarial-network, generative-model
coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+1008%)
Mutual labels:  generative-adversarial-network, generative-model
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+696%)
Mutual labels:  generative-adversarial-network, generative-model
GraphCNN-GAN
Graph-convolutional GAN for point cloud generation. Code from ICLR 2019 paper Learning Localized Generative Models for 3D Point Clouds via Graph Convolution
Stars: ✭ 50 (+100%)
Mutual labels:  generative-adversarial-network, generative-model
Conditional Gan
Anime Generation
Stars: ✭ 141 (+464%)
Mutual labels:  generative-adversarial-network, generative-model
Stylegan2 Pytorch
Simplest working implementation of Stylegan2, state of the art generative adversarial network, in Pytorch. Enabling everyone to experience disentanglement
Stars: ✭ 2,656 (+10524%)
Mutual labels:  generative-adversarial-network, generative-model
py-msa-kdenlive
Python script to load a Kdenlive (OSS NLE video editor) project file, and conform the edit on video or numpy arrays.
Stars: ✭ 25 (+0%)
Mutual labels:  generative-adversarial-network, generative-model
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+12612%)
Mutual labels:  generative-adversarial-network, generative-model

Inception Score for MNIST

Train a "perfect" MNIST classifier and use it to compute the inception score (ICP).

With our ICP implementation (a pre-trained model is saved in the directory 'model'), the MNIST test set yields a score

Note that different pre-trained models may lead to slightly different inception scores.
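
For reference, the inception score is exp(E_x[KL(p(y|x) || p(y))]), where p(y|x) is the classifier's predicted label distribution for a generated image x and p(y) is its marginal over the batch. The sketch below shows this computation on an assumed [N, 10] array of softmax outputs named `probs`; it is an illustration under those assumptions, not the repository's own implementation.

    import numpy as np

    def inception_score(probs, splits=10):
        # probs: [N, 10] softmax outputs of the MNIST classifier (assumed input).
        # Score = exp( E_x[ KL( p(y|x) || p(y) ) ] ), averaged over `splits` groups.
        scores = []
        for part in np.array_split(probs, splits):
            p_y = np.mean(part, axis=0, keepdims=True)           # marginal p(y) for this group
            kl = part * (np.log(part + 1e-12) - np.log(p_y + 1e-12))
            scores.append(np.exp(np.mean(np.sum(kl, axis=1))))   # per-group score
        return np.mean(scores), np.std(scores)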

Prerequisites: TensorFlow 1.0


The Format of Generated Images

The generated images are saved in a .mat file as a tensor named 'images' of size [10000, 784], where 10000 is the number of images and 784 is the dimension of a flattened 28×28 MNIST image.
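
As a sketch of how to produce a file in this format (the path, file name, and random data below are placeholders), scipy.io.savemat can be used as follows:

    import numpy as np
    import scipy.io

    # Placeholder batch: 10000 images, each flattened to 784 pixels in [0, 1].
    # In practice this would be the output of your generator.
    images = np.random.rand(10000, 784).astype(np.float32)

    # Save under the key 'images' so the evaluation script can find the tensor.
    scipy.io.savemat('./example_dir/001.mat', {'images': images})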

If you have multiple checkpoints (each a .mat file) saved in a folder, you may specify the directory as follows:

    import scipy.io

    # folder containing the generated images (one .mat file per checkpoint)
    result_folder = './example_dir/'

    icp = []
    for k in range(1, 51):
        # checkpoint files are named 001.mat, 002.mat, ..., 050.mat
        mat = scipy.io.loadmat(result_folder + '{}.mat'.format(str(k).zfill(3)))

If you have a single checkpoint saved in a .mat file, you may specify the file as follows:

    file_name = 'example.mat'
    mat = scipy.io.loadmat(result_folder + file_name)
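
To sanity-check what was loaded, the flattened rows can be reshaped back into 28×28 digits; the variable names below follow the snippets above:

    import numpy as np

    images = mat['images']                       # [N, 784], as described above
    images = np.reshape(images, (-1, 28, 28))    # back to 2-D digits for inspection
    print(images.shape)                          # e.g. (10000, 28, 28)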

How to Use the Code: Evaluation, Re-training, and Plotting

To evaluate the ICP of generated images, run:

    python mnist_cnn_icp_eval.py

If you would like to re-train your classifier model, run:

    python mnist_cnn_train_slim.py

If you would like to plot your inception scores for multiple checkpoints, run:

    python mnist_icp_plot.py
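
For orientation, a minimal matplotlib sketch of the kind of curve mnist_icp_plot.py produces could look like the following, assuming `icp` holds one (mean, std) pair per checkpoint (the values here are placeholders):

    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder scores; in practice these come from evaluating each checkpoint.
    icp = [(8.5, 0.10), (9.1, 0.08), (9.6, 0.05)]

    means = np.array([m for m, _ in icp])
    stds = np.array([s for _, s in icp])
    steps = np.arange(1, len(icp) + 1)

    plt.errorbar(steps, means, yerr=stds, fmt='-o', capsize=3)
    plt.xlabel('checkpoint')
    plt.ylabel('inception score (ICP)')
    plt.savefig('icp_curve.png')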

Citation

This code is used in the following paper:

    @article{li2017alice,
      title={ALICE: Towards Understanding Adversarial Learning for Joint Distribution Matching},
      author={Li, Chunyuan and Liu, Hao and Chen, Changyou and Pu, Yunchen and Chen, Liqun and Henao, Ricardo and Carin, Lawrence},
      journal={Neural Information Processing Systems (NIPS)},
      year={2017}
    }