slimgroup / InvertibleNetworks.jl

License: MIT
A Julia framework for invertible neural networks

Projects that are alternatives to or similar to InvertibleNetworks.jl

continuous-time-flow-process
PyTorch code of "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020)
Stars: ✭ 34 (-60.47%)
Mutual labels:  normalizing-flows
MongeAmpereFlow
Continuous-time gradient flow for generative modeling and variational inference
Stars: ✭ 29 (-66.28%)
Mutual labels:  normalizing-flows
deeprob-kit
A Python Library for Deep Probabilistic Modeling
Stars: ✭ 32 (-62.79%)
Mutual labels:  normalizing-flows
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+215.12%)
Mutual labels:  invertible-neural-networks
NanoFlow
PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity." (NeurIPS 2020)
Stars: ✭ 63 (-26.74%)
Mutual labels:  normalizing-flows
benchmark_VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Stars: ✭ 1,211 (+1308.14%)
Mutual labels:  normalizing-flows
flowtorch-old
Separating Normalizing Flows code from Pyro and improving API
Stars: ✭ 36 (-58.14%)
Mutual labels:  normalizing-flows
nessai
nessai: Nested Sampling with Artificial Intelligence
Stars: ✭ 18 (-79.07%)
Mutual labels:  normalizing-flows
score_flow
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (-43.02%)
Mutual labels:  normalizing-flows
UMNN
Implementation of Unconstrained Monotonic Neural Network and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Stars: ✭ 63 (-26.74%)
Mutual labels:  normalizing-flows
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-76.74%)
Mutual labels:  normalizing-flows
semi-supervised-NFs
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning
Stars: ✭ 23 (-73.26%)
Mutual labels:  normalizing-flows
normalizing-flows
Implementations of normalizing flows using Python and TensorFlow
Stars: ✭ 15 (-82.56%)
Mutual labels:  normalizing-flows
introduction_to_normalizing_flows
Jupyter Notebook corresponding to 'Going with the Flow: An Introduction to Normalizing Flows'
Stars: ✭ 21 (-75.58%)
Mutual labels:  normalizing-flows
cflow-ad
Official PyTorch code for WACV 2022 paper "CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows"
Stars: ✭ 138 (+60.47%)
Mutual labels:  normalizing-flows
Normalizing Flows
Implementation of Normalizing flows on MNIST https://arxiv.org/abs/1505.05770
Stars: ✭ 14 (-83.72%)
Mutual labels:  normalizing-flows
ifl-tpp
Implementation of "Intensity-Free Learning of Temporal Point Processes" (Spotlight @ ICLR 2020)
Stars: ✭ 58 (-32.56%)
Mutual labels:  normalizing-flows
constant-memory-waveglow
PyTorch implementation of NVIDIA WaveGlow with constant memory cost.
Stars: ✭ 36 (-58.14%)
Mutual labels:  normalizing-flows
graph-nvp
GraphNVP: An Invertible Flow Model for Generating Molecular Graphs
Stars: ✭ 69 (-19.77%)
Mutual labels:  invertible-neural-networks

InvertibleNetworks.jl

Building blocks for invertible neural networks in the Julia programming language.

  • Memory efficient building blocks for invertible neural networks
  • Hand-derived gradients, Jacobians $J$, and log-determinants $\log|J|$ (see the sketch after this list)
  • Flux integration
  • Support for Zygote and ChainRules
  • GPU support
  • Includes various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification
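
As a minimal sketch of how the returned $\log|J|$ enters a maximum-likelihood objective (reusing ActNorm and its (Y, logdet) forward signature from the GPU example below; the exact normalization of the returned logdet is an assumption):

using InvertibleNetworks, LinearAlgebra

# Toy input: 64x64 images, 10 channels, batch of 4
X = randn(Float32, 64, 64, 10, 4)

# Layers constructed with logdet=true return (Y, logdet) from forward
AN = ActNorm(10; logdet=true)
Y, logdet = AN.forward(X)

# Negative log-likelihood under a standard normal latent, up to
# additive constants (normalization of the logdet term is assumed)
nll = 0.5f0 * norm(Y)^2 - logdet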

Installation

InvertibleNetworks is registered and can be added like any standard Julia package with the command:

] add InvertibleNetworks
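
Equivalently, from a script or any environment where the Pkg REPL mode is unavailable:

using Pkg
Pkg.add("InvertibleNetworks")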

Papers

The following publications use InvertibleNetworks.jl:

Building blocks

  • 1x1 Convolutions using Householder transformations (example; see the sketch after this list)

  • Residual block (example)

  • Invertible coupling layer from Dinh et al. (2017) (example)

  • Invertible hyperbolic layer from Lensink et al. (2019) (example)

  • Invertible coupling layer from Putzky and Welling (2019) (example)

  • Invertible recursive coupling layer HINT from Kruse et al. (2020) (example)

  • Activation normalization (Kingma and Dhariwal, 2018) (example)

  • Various activation functions (Sigmoid, ReLU, leaky ReLU, GaLU)

  • Objective and misfit functions (mean squared error, log-likelihood)

  • Dimensionality manipulation: squeeze/unsqueeze (column, patch, checkerboard), split/cat

  • Squeeze/unsqueeze using the wavelet transform
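
As a quick illustration of these building blocks, here is a hedged round-trip sketch for the 1x1 convolution; it assumes Conv1x1(k) is the exported constructor and that it follows the same forward/inverse calling convention as ActNorm in the GPU example below:

using InvertibleNetworks

X = randn(Float32, 64, 64, 10, 4)

# Orthogonal 1x1 convolution parameterized by Householder reflections
C = Conv1x1(10)

Y  = C.forward(X)
X_ = C.inverse(Y)

# Invertibility: the round trip recovers X up to floating-point error
isapprox(X, X_; rtol=1f-4)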

Examples

  • Invertible recurrent inference machines (Putzky and Welling, 2019) (generic example)

  • Generative models with maximum likelihood via the change of variable formula (example; the formula is spelled out after this list)

  • Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) (generic example, source)
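
For reference, the change of variable formula behind these generative models: for an invertible network $f$ mapping data $x$ to a latent code $z = f(x)$ with Jacobian $J_f$, the data log-likelihood is

$\log p_X(x) = \log p_Z(f(x)) + \log \left| \det J_f(x) \right|,$

which is maximized over the training samples, with $p_Z$ typically a standard normal.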

GPU support

GPU support is provided via Flux/CuArray. To run on the GPU, move the input and the network layer to the GPU via |> gpu:

using InvertibleNetworks, Flux

# Input
nx = 64
ny = 64
k = 10
batchsize = 4

# Input image: nx x ny x k x batchsize
X = randn(Float32, nx, ny, k, batchsize) |> gpu

# Activation normalization
AN = ActNorm(k; logdet=true) |> gpu

# Test invertibility
Y_, logdet = AN.forward(X)
X_ = AN.inverse(Y_)   # recover the input from the output
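
Gradients come from hand-derived backward passes. Continuing the example above, a hedged sketch (the (ΔX, X) return convention of backward, and parameter gradients accumulating on the layer as a side effect, are assumptions about the API):

# Gradient of a 0.5*||Y||^2 data-fitting term with respect to Y is Y itself
ΔY = copy(Y_)

# Hand-derived backward pass: returns the input gradient and recomputes
# X from Y via invertibility instead of storing activations, which is
# what makes these building blocks memory efficient
ΔX, X_ = AN.backward(ΔY, Y_)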

Acknowledgments

This package uses functions from NNlib.jl, Flux.jl, and Wavelets.jl.

References

  • Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. https://arxiv.org/pdf/1612.08083.pdf

  • Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017. https://arxiv.org/abs/1605.08803

  • Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. https://arxiv.org/abs/1807.03039

  • Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. https://arxiv.org/abs/1905.10484

  • Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. https://arxiv.org/abs/1911.10914

  • Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. https://arxiv.org/abs/1905.10687

Authors

  • Philipp Witte, Georgia Institute of Technology (now Microsoft)

  • Gabrio Rizzuti, Utrecht University

  • Mathias Louboutin, Georgia Institute of Technology

  • Ali Siahkoohi, Georgia Institute of Technology
