
robert-giaquinto / gradient-boosted-normalizing-flows

License: MIT
We got a stew going!

Programming Languages

  • python
  • shell

Projects that are alternatives of or similar to gradient-boosted-normalizing-flows

NanoFlow
PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity." (NeurIPS 2020)
Stars: ✭ 63 (+215%)
Mutual labels:  density-estimation, normalizing-flows, deep-generative-model
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+1255%)
Mutual labels:  variational-inference, density-estimation, variational-autoencoder
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+3935%)
Mutual labels:  variational-inference, variational-autoencoder
Kvae
Kalman Variational Auto-Encoder
Stars: ✭ 115 (+475%)
Mutual labels:  variational-inference, variational-autoencoder
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+750%)
Mutual labels:  density-estimation, variational-autoencoder
lagvae
Lagrangian VAE
Stars: ✭ 27 (+35%)
Mutual labels:  variational-inference, variational-autoencoder
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+1990%)
Mutual labels:  variational-inference, variational-autoencoder
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+595%)
Mutual labels:  variational-inference, variational-autoencoder
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GANs Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+1280%)
Mutual labels:  variational-inference, variational-autoencoder
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+280%)
Mutual labels:  density-estimation, deep-generative-model
SIVI
Using neural network to build expressive hierarchical distribution; A variational method to accurately estimate posterior uncertainty; A fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (+145%)
Mutual labels:  variational-inference, variational-autoencoder
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-35%)
Mutual labels:  variational-inference, variational-autoencoder
Normalizing Flows
Understanding normalizing flows
Stars: ✭ 126 (+530%)
Mutual labels:  variational-inference, variational-autoencoder
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+5955%)
Mutual labels:  normalizing-flows, variational-autoencoder
Gumbel-CRF
Implementation of NeurIPS 20 paper: Latent Template Induction with Gumbel-CRFs
Stars: ✭ 51 (+155%)
Mutual labels:  density-estimation, deep-generative-model
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-10%)
Mutual labels:  variational-inference, variational-autoencoder
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (+100%)
Mutual labels:  variational-inference
vaegan
An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+340%)
Mutual labels:  variational-autoencoder
delfi
Density estimation for likelihood-free inference. No longer actively developed; see https://github.com/mackelab/sbi instead
Stars: ✭ 66 (+230%)
Mutual labels:  density-estimation
normalizing-flows
Implementations of normalizing flows using python and tensorflow
Stars: ✭ 15 (-25%)
Mutual labels:  normalizing-flows

Gradient Boosted Normalizing Flows


Introduction

The trend in the normalizing flow (NF) literature has been to devise deeper, more complex transformations to achieve greater flexibility.

We propose an alternative: Gradient Boosted Normalizing Flows (GBNF), which model a density by successively adding new NF components with gradient boosting. Under the boosting framework, each new NF component optimizes a sample-weighted likelihood objective, so that new components are fit to the residuals of the previously trained components.
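To give intuition for the residual fitting described above, here is a minimal sketch of a sample-weighting scheme in which points the current model explains poorly (low log-density) receive larger weight for training the next component. The function name and the exact form of the weighting are illustrative only, not taken from the GBNF codebase or the paper's objective.

```python
import math

def residual_weights(model_log_probs):
    """Illustrative per-sample weights for fitting the next boosted component.

    Samples with low log-density under the current mixture get larger
    weight, so the next flow component concentrates on the residual mass.
    This is a schematic weighting, not the paper's exact objective.
    """
    # Weight each sample inversely to its current density, then normalize.
    raw = [math.exp(-lp) for lp in model_log_probs]
    total = sum(raw)
    return [r / total for r in raw]

# A well-modeled point (density 0.9) vs. a poorly modeled one (density 0.1):
weights = residual_weights([math.log(0.9), math.log(0.1)])
# The poorly modeled point receives most of the weight.
```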

The GBNF formulation results in a mixture model structure, whose flexibility increases as more components are added. Moreover, GBNFs offer a wider, as opposed to strictly deeper, approach that improves existing NFs at the cost of additional training rather than more complex transformations.
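Because the trained model is a mixture, its density at a point is a convex combination of the component flows' densities, which is most stably computed in log space. The sketch below shows this combination with plain Python; the function name is hypothetical and the per-component log-densities are stand-ins for what each flow would report.

```python
import math

def mixture_log_density(component_log_probs, log_weights):
    """Combine per-component log-densities into a mixture log-density.

    Computes log( sum_k w_k * p_k(x) ) via the log-sum-exp trick for
    numerical stability. Illustrative only, not the GBNF codebase API.
    """
    terms = [lw + lp for lw, lp in zip(log_weights, component_log_probs)]
    m = max(terms)  # factor out the largest term to avoid underflow
    return m + math.log(sum(math.exp(t - m) for t in terms))

# Two equally weighted components with densities 0.2 and 0.4 at some x:
log_p = mixture_log_density(
    [math.log(0.2), math.log(0.4)],
    [math.log(0.5), math.log(0.5)],
)
# Mixture density: 0.5 * 0.2 + 0.5 * 0.4 = 0.3
```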

Link to paper:

Gradient Boosted Normalizing Flows by Robert Giaquinto and Arindam Banerjee. In Advances in Neural Information Processing Systems (NeurIPS), 2020.

Requirements

The code is compatible with:

  • pytorch 1.1.0
  • python 3.6+ (python 2.7 may also work if you import print_function from __future__)

It is recommended that you create a virtual environment with the correct python version and dependencies. After cloning the repository, change into the repository directory and run the following commands to create a virtual environment:

python -m venv ./venv
source ./venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt

(The commands assume python refers to python 3.6+; if it does not, use python3.)

Data

The experiments can be run on the following image datasets:

  • static MNIST: included in the data folder;
  • OMNIGLOT: the dataset can be downloaded from link;
  • Caltech 101 Silhouettes: the dataset can be downloaded from link;
  • Frey Faces: the dataset can be downloaded from link;
  • CIFAR10: available through the Torchvision library;
  • CelebA: available through the Torchvision library.

Additionally, density estimation experiments can be run on datasets from the UCI repository, which can be downloaded by running:

./download_datasets.sh

Project Structure

  • main_experiment.py: Run experiments for generative modeling with variational autoencoders on image datasets.
  • density_experiment.py: Run experiments for density estimation on real datasets.
  • toy_experiment.py: Run experiments for the toy datasets for density estimation and matching.
  • image_experiment.py: Run experiments for image modeling with only flows (no VAE).
  • models: Collection of the models implemented in the experiments.
  • optimization: Training, evaluation, and loss functions used in the main experiment.
  • scripts: Bash scripts for running experiments, along with the default configurations used in the experiments.
  • utils: Utility functions, plotting, and data preparation.
  • data: Folder containing the raw data.

Getting Started

The scripts folder includes examples for running the GBNF model on toy problems, a density estimation benchmark, and the Caltech 101 Silhouettes image dataset.

Toy problem: match 2-moons energy function with Boosted Real-NVPs

./scripts/getting_started_toy_matching_gbnf.sh &

Toy problem: density estimation on the 8-Gaussians with Boosted Real-NVPs

./scripts/getting_started_toy_estimation_gbnf.sh &

Density estimation of MINIBOONE dataset with Boosted Glow

./scripts/getting_started_density_estimation_gbnf.sh &

Generative modeling of Caltech 101 Silhouettes images with Boosted Real-NVPs

./scripts/getting_started_vae_gbnf.sh &


More information about additional argument options can be found by running `python main_experiment.py -h`.