shayneobrien / Generative Models

License: MIT
Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN

Programming Languages

python

Projects that are alternatives of or similar to Generative Models

Generative adversarial networks 101
Keras implementations of Generative Adversarial Networks. GANs, DCGAN, CGAN, CCGAN, WGAN and LSGAN models with MNIST and CIFAR-10 datasets.
Stars: ✭ 138 (-68.49%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network, wgan
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-69.41%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network, vae
Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+764.16%)
Mutual labels:  gan, vae, wgan
Image generator
DCGAN image generator 🖼️.
Stars: ✭ 173 (-60.5%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Gan steerability
On the "steerability" of generative adversarial networks
Stars: ✭ 225 (-48.63%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Sdv
Synthetic Data Generation for tabular, relational and time series data.
Stars: ✭ 360 (-17.81%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Pytorch Gan
A minimal implementation (less than 150 lines of code, with visualization) of DCGAN/WGAN in PyTorch with Jupyter notebooks
Stars: ✭ 150 (-65.75%)
Mutual labels:  jupyter-notebook, gan, wgan
Pytorch Rl
This repository contains model-free deep reinforcement learning algorithms implemented in Pytorch
Stars: ✭ 394 (-10.05%)
Mutual labels:  gan, generative-adversarial-network, vae
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (-77.17%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Tensorflow Tutorial
Tensorflow tutorial from basic to advanced; Chinese-language AI lessons by 莫烦Python
Stars: ✭ 4,122 (+841.1%)
Mutual labels:  gan, generative-adversarial-network, autoencoder
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+625.57%)
Mutual labels:  gan, generative-adversarial-network, autoencoder
Faceswap Gan
A denoising autoencoder + adversarial losses and attention mechanisms for face swapping.
Stars: ✭ 3,099 (+607.53%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Capsule Gan
Code for my Master thesis on "Capsule Architecture as a Discriminator in Generative Adversarial Networks".
Stars: ✭ 120 (-72.6%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Stylegan2 Projecting Images
Projecting images to latent space with StyleGAN2.
Stars: ✭ 102 (-76.71%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Deep Learning Resources
Deep learning resources from beginner to advanced; a collection of deep learning materials for everyone
Stars: ✭ 422 (-3.65%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (-56.85%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Calogan
Generative Adversarial Networks for High Energy Physics extended to a multi-layer calorimeter simulation
Stars: ✭ 87 (-80.14%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Sprint gan
Privacy-preserving generative deep neural networks support clinical data sharing
Stars: ✭ 92 (-79%)
Mutual labels:  jupyter-notebook, gan, generative-adversarial-network
Gan Tutorial
Simple Implementation of many GAN models with PyTorch.
Stars: ✭ 227 (-48.17%)
Mutual labels:  jupyter-notebook, gan, wgan
Zhihu
This repo contains the source code for my personal column (https://zhuanlan.zhihu.com/zhaoyeyu), implemented in Python 3.6. It includes Natural Language Processing and Computer Vision projects, such as text generation, machine translation, deep convolutional GANs, and other hands-on examples.
Stars: ✭ 3,307 (+655.02%)
Mutual labels:  jupyter-notebook, gan, autoencoder

Overview

PyTorch 0.4.1 | Python 3.6.5

Annotated implementations with comparative introductions for minimax, non-saturating, Wasserstein, Wasserstein gradient penalty, least squares, deep regret analytic, bounded equilibrium, relativistic, f-divergence, Fisher, and information generative adversarial networks (GANs), and for standard, variational, and bounded information rate variational autoencoders (VAEs).

Paper links are supplied at the beginning of each file, along with a short summary of the paper. See the src folder for files to run from the terminal, or the notebooks folder for Jupyter notebook visualizations in your local browser. The main file-to-file changes can be seen in the train, train_D, and train_G methods of the Trainer class, although changes are not completely limited to these methods (e.g., Wasserstein GAN clamps weights in the train function, BEGAN returns multiple outputs from train_D, and fGAN slightly modifies the viz_loss function to indicate the method used in the plot title).

All code in this repository operates in a generative, unsupervised manner on binary (black-and-white) MNIST. The architectures are compatible with a variety of data shapes (1D, 2D, and square 3D images), and the plotting functions work with binary and RGB images. If a GPU is detected, the models use it; otherwise, they default to the CPU. VAE Trainer classes contain methods to visualize latent space representations (see the make_all function).
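For reference, the standard PyTorch device-selection idiom looks roughly like the sketch below (a minimal illustration; the repository's actual code may differ in detail):

import torch
import torch.nn as nn

# Pick the GPU when one is detected, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(784, 400).to(device)       # example module moved to the device
batch = torch.randn(64, 784, device=device)  # example batch created on the device
output = model(batch)                        # forward pass runs on the chosen device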

Usage

To initialize an environment:

python -m venv env  
. env/bin/activate  
pip install -r requirements.txt  

To play around in the Jupyter notebooks:

jupyter notebook

To run from Terminal:

cd src
python bir_vae.py

New Models

One of the primary purposes of this repository is to make implementing deep generative model (i.e., GAN/VAE) variants as easy as possible. This is possible because, typically but not always (e.g. BIRVAE), the proposed modifications only apply to the way loss is computed for backpropagation. Thus, the core training class is structured in such a way that most new implementations should only require edits to the train_D and train_G functions of GAN Trainer classes, and the compute_batch function of VAE Trainer classes.

Suppose we have a non-saturating GAN and we want to implement a least-squares GAN instead. To do this, all we have to do is change two lines:

Original (NSGAN)

def train_D(self, images):
  ...
  D_loss = -torch.mean(torch.log(DX_score + 1e-8) + torch.log(1 - DG_score + 1e-8))

  return D_loss
def train_G(self, images):
  ...
  G_loss = -torch.mean(torch.log(DG_score + 1e-8))

  return G_loss

New (LSGAN)

def train_D(self, images):
  ...
  D_loss = (0.50 * torch.mean((DX_score - 1.)**2)) + (0.50 * torch.mean((DG_score - 0.)**2))

  return D_loss
def train_G(self, images):
  ...
  G_loss = 0.50 * torch.mean((DG_score - 1.)**2)

  return G_loss
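The same pattern extends to other variants. As a further sketch (illustrative only; as noted above, the repository's actual WGAN implementation also clamps the critic's weights in the train function), a Wasserstein-style critic loss would look like:

def train_D(self, images):
  ...
  # Critic maximizes the gap between its scores on real and generated samples
  D_loss = -(torch.mean(DX_score) - torch.mean(DG_score))

  return D_loss
def train_G(self, images):
  ...
  # Generator maximizes the critic's score on generated samples
  G_loss = -torch.mean(DG_score)

  return G_loss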

Model Architecture

The architecture chosen in these implementations for both the generator (G) and discriminator (D) is a simple two-layer feedforward network. While this gives sensible output for MNIST, in practice it is recommended to use a deep convolutional architecture (e.g., DCGAN) to get nicer outputs. This can be done by editing the Generator and Discriminator classes for GANs, or the Encoder and Decoder classes for VAEs.
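For orientation, such a two-layer feedforward generator/discriminator pair might look roughly like the following (layer sizes and activations here are illustrative assumptions, not necessarily the exact ones used in this repository):

import torch.nn as nn

class Generator(nn.Module):
  """Maps a latent vector z to a flattened image."""
  def __init__(self, image_size=784, hidden_dim=400, z_dim=20):
    super().__init__()
    self.generate = nn.Sequential(
      nn.Linear(z_dim, hidden_dim),
      nn.ReLU(),
      nn.Linear(hidden_dim, image_size),
      nn.Sigmoid(),  # pixel intensities in [0, 1]
    )

  def forward(self, z):
    return self.generate(z)

class Discriminator(nn.Module):
  """Maps a flattened image to a probability that it is real."""
  def __init__(self, image_size=784, hidden_dim=400):
    super().__init__()
    self.discriminate = nn.Sequential(
      nn.Linear(image_size, hidden_dim),
      nn.ReLU(),
      nn.Linear(hidden_dim, 1),
      nn.Sigmoid(),  # real/fake probability
    )

  def forward(self, x):
    return self.discriminate(x)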

Visualization

All models were trained for 25 epochs with a hidden dimension of 400 and a latent dimension of 20. Other implementation specifics follow the respective original papers (linked) as closely as possible.

[Results table omitted: for each model (MMGAN, NSGAN, WGAN, WGPGAN, DRAGAN, BEGAN, LSGAN, RaNSGAN, FisherGAN, InfoGAN, f-TVGAN, f-PearsonGAN, f-JSGAN, f-ForwGAN, f-RevGAN, f-HellingerGAN, VAE, BIRVAE), the original README shows samples at Epoch 1 and Epoch 25, a training-progress animation, and the loss curve.]

To Do

Models: CVAE, denoising VAE, adversarial autoencoder | Bayesian GAN, Self-attention GAN, Primal-Dual Wasserstein GAN
Architectures: Add DCGAN option
Datasets: Beyond MNIST

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].