
siddsax / Normalizing_Flows

Licence: other
Implementation of normalizing flows on MNIST (https://arxiv.org/abs/1505.05770)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Normalizing Flows

TensorFlow-OS-ELM
A tensorflow implementation of OS-ELM (Online Sequential Extreme Learning Machine)
Stars: ✭ 46 (+228.57%)
Mutual labels:  implementation
Machine-Learning-Algorithms-From-Scratch
A collection of commonly used machine learning algorithms implemented in Python/Numpy
Stars: ✭ 43 (+207.14%)
Mutual labels:  implementation
erc721
The reference implementation of the ERC-721 non-fungible token standard.
Stars: ✭ 989 (+6964.29%)
Mutual labels:  implementation
iris
The interpreter of ISLisp
Stars: ✭ 58 (+314.29%)
Mutual labels:  implementation
MinecraftC
A Raytraced Minecraft Classic 0.0.30a port to C
Stars: ✭ 250 (+1685.71%)
Mutual labels:  implementation
amongus-protocol
An implementation of the Among Us protocol in typescript.
Stars: ✭ 56 (+300%)
Mutual labels:  implementation
colocat
Fegeya Colocat, Colorized 'cat' implementation. Written in C++17.
Stars: ✭ 14 (+0%)
Mutual labels:  implementation
laravository
Simplified Repository pattern implementation in Laravel
Stars: ✭ 14 (+0%)
Mutual labels:  implementation
open source start
Go through the readme... fork ....add....send a pull request .... get yourself in the contribution list...Plant the tree
Stars: ✭ 10 (-28.57%)
Mutual labels:  implementation
continuous-time-flow-process
PyTorch code of "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020)
Stars: ✭ 34 (+142.86%)
Mutual labels:  normalizing-flows
RCNN
Step-By-Step Implementation of R-CNN from scratch in python
Stars: ✭ 75 (+435.71%)
Mutual labels:  implementation
prunnable-layers-pytorch
Prunable nn layers for pytorch.
Stars: ✭ 47 (+235.71%)
Mutual labels:  implementation
MongeAmpereFlow
Continuous-time gradient flow for generative modeling and variational inference
Stars: ✭ 29 (+107.14%)
Mutual labels:  normalizing-flows
bytenet translation
A TensorFlow implementation of the machine translation model in "Neural Machine Translation in Linear Time"
Stars: ✭ 60 (+328.57%)
Mutual labels:  implementation
InvertibleNetworks.jl
A Julia framework for invertible neural networks
Stars: ✭ 86 (+514.29%)
Mutual labels:  normalizing-flows
NanoFlow
PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity." (NeurIPS 2020)
Stars: ✭ 63 (+350%)
Mutual labels:  normalizing-flows
deeprob-kit
A Python Library for Deep Probabilistic Modeling
Stars: ✭ 32 (+128.57%)
Mutual labels:  normalizing-flows
ifl-tpp
Implementation of "Intensity-Free Learning of Temporal Point Processes" (Spotlight @ ICLR 2020)
Stars: ✭ 58 (+314.29%)
Mutual labels:  normalizing-flows
constant-memory-waveglow
PyTorch implementation of NVIDIA WaveGlow with constant memory cost.
Stars: ✭ 36 (+157.14%)
Mutual labels:  normalizing-flows
Leetcode-solutions
Leetcode Grinder.
Stars: ✭ 14 (+0%)
Mutual labels:  implementation

Normalizing flows to generate MNIST Digits

This codebase implements normalizing flows, as proposed in Rezende and Mohamed (2015), to generate MNIST digits using TensorFlow.
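For intuition, here is a minimal NumPy sketch (illustrative only, not the repository's code) of a single planar flow step from the paper, f(z) = z + u * tanh(w^T z + b), together with the log-determinant term it contributes to the variational bound:

import numpy as np

def planar_flow(z, u, w, b):
    # One planar flow step f(z) = z + u * tanh(w^T z + b) and its
    # log|det Jacobian| (Rezende & Mohamed, 2015), for a batch z of shape (N, D).
    a = z @ w + b                             # pre-activation, shape (N,)
    f_z = z + np.outer(np.tanh(a), u)         # transformed samples, shape (N, D)
    psi = np.outer(1.0 - np.tanh(a) ** 2, w)  # psi(z) = h'(a) * w, shape (N, D)
    log_det = np.log(np.abs(1.0 + psi @ u))   # log|det df/dz|, shape (N,)
    return f_z, log_det

# Stacking K such flows turns a simple base density q0(z0) into a richer qK(zK):
#   log qK(zK) = log q0(z0) - sum_k log|det df_k / dz_{k-1}|
z0 = np.random.default_rng(0).standard_normal((5, 2))  # 5 samples from a 2-D base Gaussian
zK, log_det = planar_flow(z0, u=np.array([0.5, -0.3]), w=np.array([1.0, 2.0]), b=0.1)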

Usage:

python main.py [plot_or_not] [number of flows]

There is no need to download MNIST manually; TensorFlow fetches it for you.
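For reference, the modern TensorFlow 2.x idiom for this is sketched below (the repository itself may rely on the older TF 1.x MNIST helpers; this is just the equivalent current call):

import tensorflow as tf

# MNIST is downloaded and cached automatically on first use.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0  # flatten to 784-d vectors in [0, 1]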

Outputs

  • 10 files, one per digit, containing that digit's latent states
  • 1 file with the combined latent states
  • A plot of the latent states (as in the figure below) if plot_or_not=1
  • A folder named Out containing generated samples saved every 100 iterations

2D latent vector representation: left is a vanilla VAE, right is with normalizing flows. The model with normalizing flows learns a flexible, multi-modal distribution, as opposed to the unimodal Gaussian of the vanilla VAE.
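The reason the flow posterior can become multi-modal is the change-of-variables correction applied at every flow step. A rough sketch of how the per-flow log-determinants enter the variational objective (function and argument names here are placeholders, not the repository's API):

import numpy as np

def flow_elbo(log_q0_z0, log_dets, log_p_x_given_zK, log_p_zK):
    # Monte-Carlo estimate of the flow-augmented ELBO for one batch.
    #   log_q0_z0        : log-density of z0 under the base Gaussian posterior, shape (N,)
    #   log_dets         : list of K arrays log|det df_k/dz_{k-1}|, each shape (N,)
    #   log_p_x_given_zK : reconstruction log-likelihood, shape (N,)
    #   log_p_zK         : prior log-density of the final sample zK, shape (N,)
    log_qK_zK = log_q0_z0 - np.sum(log_dets, axis=0)  # change of variables across the K flows
    return np.mean(log_p_x_given_zK + log_p_zK - log_qK_zK)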

If you use this codebase, please cite:

@article{saxena2017variational,
  title={Variational Inference via Transformations on Distributions},
  author={Saxena, Siddhartha and Dohare, Shibhansh and Kapoor, Jaivardhan},
  journal={arXiv preprint arXiv:1707.02510},
  year={2017}
}