AWehenkel / UMNN

License: BSD-3-Clause
Implementation of Unconstrained Monotonic Neural Network and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.


Unconstrained Monotonic Neural Networks (UMNN)

Official implementation of Unconstrained Monotonic Neural Networks (UMNN) and the experiments presented in the paper:

Antoine Wehenkel and Gilles Louppe. "Unconstrained Monotonic Neural Networks." (2019). [arxiv]

Dependencies

The code has been tested with PyTorch 1.1 and Python 3.6. Some of the code for drawing figures and loading datasets is taken from FFJORD and Sylvester normalizing flows for variational inference.

Usage

Simple Monotonic Function

This experiment is not described in the paper. We create the following dataset: x = [x_1, x_2, x_3] is drawn from a multivariate Gaussian and y = 0.001(x_1^3 + x_1) + x_2 + sin(x_3). We assume we are told only that y is monotonic with respect to x_1.

python MonotonicMLP.py 

In this experiment we show that a classical MLP fails to model a function that is monotonic with respect to x_1, because the effect of x_1 is small compared to that of the other variables. The UMNN fits the data better than an MLP while guaranteeing that its output is monotonic with respect to x_1.
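As an illustrative sketch of the dataset described above (not the script's actual code; the variable names, sample size, and random seed are assumptions):

```python
import numpy as np

# Draw x = [x_1, x_2, x_3] from a standard multivariate Gaussian
rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 3))

# y is monotonically increasing in x_1, but the x_1 term is tiny
# compared to the contributions of x_2 and x_3
y = 0.001 * (x[:, 0] ** 3 + x[:, 0]) + x[:, 1] + np.sin(x[:, 2])
```

Because the x_1 term is three orders of magnitude smaller than the others, an unconstrained network can easily learn a fit that is locally non-monotonic in x_1; building the monotonicity constraint into the architecture rules this out.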

Toy Experiments

python ToyExperiments.py 

See ToyExperiments.py for optional arguments.

MNIST

python MNISTExperiment.py

See MNISTExperiment.py for optional arguments.

UCI Dataset

You have to download the datasets with the following command:

python datasets/download_datasets.py 

Then you can execute:

python UCIExperiments.py --data ['power', 'gas', 'hepmass', 'miniboone', 'bsds300']

See UCIExperiments.py for optional arguments.

VAE

You have to download the datasets:

  • MNIST:
python datasets/download_datasets.py
  • OMNIGLOT: the dataset can be downloaded from link.
  • Caltech 101 Silhouettes: the dataset can be downloaded from link.
  • Frey Faces: the dataset can be downloaded from link.
python TrainVaeFlow.py -d ['mnist', 'freyfaces', 'omniglot', 'caltech']

Other Usage

All the files related to the implementation of UMNN (conditioner network, integrand network, and integral) are located in the folder models/UMNN.

  • NeuralIntegral.py computes the integral of a neural network (with 1d output) using Clenshaw-Curtis (CC) quadrature; it sequentially computes the evaluation points required by CC.
  • ParallelNeuralIntegral.py processes all the evaluation points at once, making the computation almost as fast as a forward evaluation of the network, but at the price of a higher memory cost.
  • UMNNMAF.py contains the implementation of the different networks required by UMNN.
  • UMNNMAFFlow.py contains the implementation of flows made of UMNNs.
  • Check here if you are interested in modeling functions that are monotonic with respect to more than one input variable. (Do not hesitate to contact me for more details.)
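The core idea behind these files can be sketched as follows: a monotonic function is parameterized as the integral of a strictly positive integrand, F(x) = F(0) + ∫_0^x f(t) dt, and the integral is approximated by evaluating f at fixed quadrature nodes. The sketch below is illustrative only: it uses NumPy with Gauss-Legendre nodes in place of the repository's PyTorch Clenshaw-Curtis implementation, and the toy integrand is an assumption, not the actual integrand network.

```python
import numpy as np

def integrand(t, w):
    # Softplus keeps the integrand strictly positive, so the
    # antiderivative computed below is strictly increasing in x.
    return np.log1p(np.exp(w[0] * t + w[1]))

def monotonic_net(x, w, n_nodes=32):
    # F(x) = F(0) + int_0^x f(t) dt, approximated by rescaling
    # fixed quadrature nodes from [-1, 1] onto [0, x].
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    t = 0.5 * x * (nodes + 1.0)
    return 0.5 * x * np.sum(weights * integrand(t, w))

w = np.array([0.7, -0.3])
ys = [monotonic_net(x, w) for x in np.linspace(-3.0, 3.0, 7)]
assert all(a < b for a, b in zip(ys, ys[1:]))  # strictly increasing
```

Monotonicity holds by construction (F'(x) = f(x) > 0) regardless of the integrand's parameters, which is what lets the integrand network remain unconstrained. In the actual implementation the integrand is a neural network, x and w are batched tensors, and the choice between NeuralIntegral and ParallelNeuralIntegral trades memory for speed in how the quadrature points are evaluated.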

Cite

If you make use of this code in your own work, please cite our paper:

@inproceedings{wehenkel2019unconstrained,
  title={Unconstrained monotonic neural networks},
  author={Wehenkel, Antoine and Louppe, Gilles},
  booktitle={Advances in Neural Information Processing Systems},
  pages={1543--1553},
  year={2019}
}