
VincentStimper / normalizing-flows

License: MIT
PyTorch implementation of normalizing flow models

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to normalizing-flows

gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-92.62%)
Mutual labels:  variational-inference, density-estimation, variational-autoencoder
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-93.36%)
Mutual labels:  variational-inference, variational-autoencoder
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-48.71%)
Mutual labels:  variational-inference, variational-autoencoder
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-95.2%)
Mutual labels:  variational-inference, variational-autoencoder
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+197.79%)
Mutual labels:  variational-inference, variational-autoencoder
lagvae
Lagrangian VAE
Stars: ✭ 27 (-90.04%)
Mutual labels:  variational-inference, variational-autoencoder
Normalizing Flows
Understanding normalizing flows
Stars: ✭ 126 (-53.51%)
Mutual labels:  variational-inference, variational-autoencoder
SIVI
Using neural network to build expressive hierarchical distribution; A variational method to accurately estimate posterior uncertainty; A fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (-81.92%)
Mutual labels:  variational-inference, variational-autoencoder
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+54.24%)
Mutual labels:  variational-inference, variational-autoencoder
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GANs Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+1.85%)
Mutual labels:  variational-inference, variational-autoencoder
Kvae
Kalman Variational Auto-Encoder
Stars: ✭ 115 (-57.56%)
Mutual labels:  variational-inference, variational-autoencoder
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (-37.27%)
Mutual labels:  density-estimation, variational-autoencoder
Rethinking Tensorflow Probability
Statistical Rethinking (2nd Ed) with Tensorflow Probability
Stars: ✭ 152 (-43.91%)
Mutual labels:  variational-inference
boundary-gp
Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
Stars: ✭ 21 (-92.25%)
Mutual labels:  variational-inference
Variational Inference With Normalizing Flows
Reimplementation of Variational Inference with Normalizing Flows (https://arxiv.org/abs/1505.05770)
Stars: ✭ 146 (-46.13%)
Mutual labels:  variational-inference
VAE-Latent-Space-Explorer
Interactive exploration of MNIST variational autoencoder latent space with React and tensorflow.js.
Stars: ✭ 30 (-88.93%)
Mutual labels:  variational-autoencoder
MIDI-VAE
No description or website provided.
Stars: ✭ 56 (-79.34%)
Mutual labels:  variational-autoencoder
Celeste.jl
Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (-47.6%)
Mutual labels:  variational-inference
Vbmc
Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (-54.61%)
Mutual labels:  variational-inference
svae cf
[ WSDM '19 ] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (-85.98%)
Mutual labels:  variational-autoencoder

normflows: A PyTorch Package for Normalizing Flows


This is a PyTorch implementation of normalizing flows. Many popular flow architectures are implemented; see the list below. The package can be easily installed via pip. The basic usage is described here, and full documentation is available as well. Several sample use cases are implemented in the examples folder, including Glow, a VAE, and a Residual Flow.

Implemented Flows

Architecture Reference
Planar Flow Rezende & Mohamed, 2015
Radial Flow Rezende & Mohamed, 2015
NICE Dinh et al., 2014
Real NVP Dinh et al., 2017
Glow Kingma et al., 2018
Masked Autoregressive Flow Papamakarios et al., 2017
Neural Spline Flow Durkan et al., 2019
Circular Neural Spline Flow Rezende et al., 2020
Residual Flow Chen et al., 2019
Stochastic Normalizing Flow Wu et al., 2020

Note that Neural Spline Flows with circular and non-circular coordinates are supported as well.
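Any of these architectures can be stacked with the same list-of-flows pattern shown in the Usage section below. As a rough sketch for a Neural Spline Flow on a 2D target, assuming the classes AutoregressiveRationalQuadraticSpline and LULinearPermute in nf.flows behave as documented (check the API reference for the exact signatures):

import normflows as nf

# Assumed API: AutoregressiveRationalQuadraticSpline(num_input_channels, num_blocks, num_hidden_channels)
# and LULinearPermute(num_channels); verify against the package documentation
latent_size = 2
flows = []
for i in range(16):
    # Autoregressive rational-quadratic spline transformation
    flows.append(nf.flows.AutoregressiveRationalQuadraticSpline(latent_size, 2, 128))
    # Invertible LU-decomposed linear layer that mixes the dimensions between spline layers
    flows.append(nf.flows.LULinearPermute(latent_size))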

Installation

The latest version of the package can be installed via pip

pip install normflows

At least Python 3.7 is required. If you want to use a GPU, make sure that PyTorch is set up correctly by following the instructions at the PyTorch website.
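Since normflows models are regular PyTorch modules, moving them to a GPU works the usual way. A minimal sketch, where model refers to an nf.NormalizingFlow instance as built in the Usage section below:

import torch

# Use the GPU if PyTorch was installed with CUDA support, otherwise fall back to the CPU
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# model is assumed to be an nf.NormalizingFlow instance; it can be moved like any nn.Module
model = model.to(device)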

To run the example notebooks, clone the repository first

git clone https://github.com/VincentStimper/normalizing-flows.git

and then install the dependencies.

pip install -r requirements_examples.txt

Usage

Open In Colab

A normalizing flow consists of a base distribution, defined in nf.distributions.base, and a list of flows, given in nf.flows. Let's assume our target is a 2D distribution. We pick a diagonal Gaussian base distribution, which is the most popular choice. Our flow will be a Real NVP model, so we need to define a neural network that computes the parameters of the affine coupling map. One dimension is used to compute the scale and shift parameters for the other dimension; after each coupling layer, the roles of the two dimensions are swapped.

import normflows as nf

# Define 2D Gaussian base distribution
base = nf.distributions.base.DiagGaussian(2)

# Define list of flows
num_layers = 32
flows = []
for i in range(num_layers):
    # Neural network with two hidden layers having 64 units each
    # Last layer is initialized with zeros, which makes training more stable
    param_map = nf.nets.MLP([1, 64, 64, 2], init_zeros=True)
    # Add flow layer
    flows.append(nf.flows.AffineCouplingBlock(param_map))
    # Swap dimensions
    flows.append(nf.flows.Permute(2, mode='swap'))

Once the base distribution and the flow layers are set up, we can define an nf.NormalizingFlow model. If the target density is available, it can be added to the model to be used during training. Sample target distributions are given in nf.distributions.target.

# If the target density is not given
model = nf.NormalizingFlow(base, flows)

# If the target density is given
target = nf.distributions.target.TwoMoons()
model = nf.NormalizingFlow(base, flows, target)

The loss can be computed with the methods of the model and minimized.

# When doing maximum likelihood learning, i.e. minimizing the forward KLD
# with no target distribution given
loss = model.forward_kld(x)

# When minimizing the reverse KLD based on the given target distribution
loss = model.reverse_kld(num_samples=512)

# Optimization as usual
loss.backward()
optimizer.step()
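Putting these pieces together, a minimal training loop could look as follows. This is a sketch assuming the model and the TwoMoons target defined above, and that the target exposes a sample method for drawing training data; the optimizer settings are illustrative, not prescriptive.

import torch

# Illustrative choice of optimizer and hyperparameters
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

for it in range(4000):
    optimizer.zero_grad()
    # Maximum likelihood training: draw a batch from the target (assumed target.sample API)
    x = target.sample(512)
    loss = model.forward_kld(x)
    loss.backward()
    optimizer.step()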

A more extensive version of this example is given as a notebook, which can directly be opened in Colab. There, we apply a Real NVP model to a bimodal target distribution and obtain the following results.

2D target distribution and Real NVP model
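After training, the model can be used for sampling and for evaluating its density. A brief sketch, assuming the sample and log_prob methods of nf.NormalizingFlow behave as documented (sample is assumed to return the samples together with their log density under the flow):

import torch

with torch.no_grad():
    # Draw samples from the flow; log_q is their log density under the model (assumed return signature)
    z, log_q = model.sample(1000)
    # Evaluate the model's log density at arbitrary points, here at the drawn samples
    log_p = model.log_prob(z)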

In another example, we apply a Neural Spline Flow model to a distribution defined on a cylinder. The resulting density is visualized below.

Neural Spline Flow applied to target distribution on a cylinder

For more illustrative examples of how to use the package, see the examples directory. More advanced experiments can be run with the scripts in the repository on resampled base distributions; see its experiments folder.

Used by

The package has been used in several research papers, which are listed below.

Andrew Campbell, Wenlong Chen, Vincent Stimper, José Miguel Hernández-Lobato, and Yichuan Zhang. A gradient based strategy for Hamiltonian Monte Carlo hyperparameter optimization. In Proceedings of the 38th International Conference on Machine Learning, pp. 1238–1248. PMLR, 2021.

Code available on GitHub.

Vincent Stimper, Bernhard Schölkopf, and José Miguel Hernández-Lobato. Resampling Base Distributions of Normalizing Flows. In Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, volume 151, pp. 4915–4936, 2022.

Code available on GitHub.

Laurence I. Midgley, Vincent Stimper, Gregor N. C. Simm, Bernhard Schölkopf, and José Miguel Hernández-Lobato. Flow Annealed Importance Sampling Bootstrap. ArXiv, abs/2208.01893, 2022.

Code available on GitHub.

Moreover, the boltzgen package has been built on top of normflows.
