License: MIT


Introduction to Deep Generative Modeling: Examples

This repository contains examples of deep generative models:

  1. Autoregressive Models (ARMs)
  2. Flow-based models (flows): RealNVP and IDFs (Integer Discrete Flows)
  3. Variational Auto-Encoders (VAEs): a plain VAE and various priors, a hierarchical VAE
  4. Diffusion-based Deep Generative Models (DDGMs): a Gaussian forward diffusion
  5. Hybrid modeling
  6. Energy-based Models
  7. Generative Adversarial Networks (GANs)
  8. Neural Compression with Deep Generative Modeling

The examples might look oversimplistic, but that's the point! My idea is that everyone should be able to follow every line of the code and run the experiments within a couple of minutes on almost any laptop or computer. My goal is to encourage people who are new to the field to understand and play with deep generative models. More advanced users, on the other hand, can refresh their knowledge or build on top of the code to quickly check their ideas. Either way, I hope the code helps everyone join the fascinating journey of deep generative modeling!

Requirements

In all examples, we used:

  • pytorch 1.7.0
  • numpy 1.17.2
  • matplotlib 3.1.1
  • scikit-learn 0.21.3
  • pytorch-model-summary 0.1.1
  • jupyter 1.0.0
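
Assuming pip, the environment above can be reproduced with a single command (package names as on PyPI, where pytorch installs as `torch`):

```shell
pip install torch==1.7.0 numpy==1.17.2 matplotlib==3.1.1 \
    scikit-learn==0.21.3 pytorch-model-summary==0.1.1 jupyter==1.0.0
```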

Examples

All examples of the implemented deep generative models are provided as Jupyter notebooks. They can be found in the following folders:

  1. arms: an example of an autoregressive model with a causal convolutional layer in 1D.
  2. flows: an example of a flow-based model, namely, RealNVP with coupling layers and permutation layers, and IDFs (Integer Discrete Flows).
  3. vaes: an example of a Variational Auto-Encoder using fully-connected layers and a standard Gaussian prior, a second example with various priors for VAEs, and a third example of a hierarchical VAE.
  4. ddgms: an example of a Diffusion-based Deep Generative Model using a Gaussian forward diffusion with a fixed variance and a reverse diffusion parameterized by MLPs.
  5. hybrid_modeling: an example of a hybrid model using fully-connected layers and IDFs.
  6. ebms: an example of an energy-based model parameterized by an MLP.
  7. gans: an example of a GAN parameterized by MLPs.
  8. neural_compression: an example of applying deep generative modeling for image neural compression.

Citation

If you use this code in any way, please refer to it in the following manner:

  • APA style:
Tomczak, J. M. (2021). Deep Generative Modeling.
  • Bibtex:
@article{tomczak2021intro,
  title={Deep Generative Modeling},
  author={Tomczak, Jakub M},
  year={2021}
}