NanoFlow: PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity" (NeurIPS 2020).
Stars: ✭ 63 (+96.88%)
flowtorch-old: Separating normalizing-flows code from Pyro and improving the API.
Stars: ✭ 36 (+12.5%)
blangSDK: Blang's software development kit.
Stars: ✭ 21 (-34.37%)
MMCAcovid19.jl: Microscopic Markov Chain Approach for modeling the spread of COVID-19.
Stars: ✭ 15 (-53.12%)
deepdb-public: Implementation of "DeepDB: Learn from Data, not from Queries!".
Stars: ✭ 61 (+90.63%)
probai-2019: Materials of the Nordic Probabilistic AI School 2019.
Stars: ✭ 127 (+296.88%)
benchmark_VAE: Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022).
Stars: ✭ 1,211 (+3684.38%)
Probability Theory: A quick introduction to the most important concepts of probability theory; only freshman-level mathematics is needed as a prerequisite.
Stars: ✭ 25 (-21.87%)
probabilistic-circuits: A curated collection of papers on probabilistic circuits, i.e. computational graphs that encode tractable probability distributions.
Stars: ✭ 33 (+3.13%)
probai-2021: Materials of the Nordic Probabilistic AI School 2021.
Stars: ✭ 83 (+159.38%)
probai-2021-pyro: Repo for the tutorials of Days 1-3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/).
Stars: ✭ 45 (+40.63%)
mta: Multi-Touch Attribution.
Stars: ✭ 60 (+87.5%)
artificial neural networks: A collection of methods and models for various architectures of artificial neural networks.
Stars: ✭ 40 (+25%)
probnmn-clevr: Code for the ICML 2019 paper "Probabilistic Neural-symbolic Models for Interpretable Visual Question Answering" [long oral].
Stars: ✭ 63 (+96.88%)
nessai: Nested Sampling with Artificial Intelligence.
Stars: ✭ 18 (-43.75%)
score_flow: Official code for "Maximum Likelihood Training of Score-Based Diffusion Models" (NeurIPS 2021, spotlight).
Stars: ✭ 49 (+53.13%)
UMNN: Implementation of the Unconstrained Monotonic Neural Network and related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Stars: ✭ 63 (+96.88%)
semi-supervised-NFs: Code for the paper "Semi-Conditional Normalizing Flows for Semi-Supervised Learning".
Stars: ✭ 23 (-28.12%)
normalizing-flows: Implementations of normalizing flows in Python and TensorFlow.
Stars: ✭ 15 (-53.12%)
cflow-ad: Official PyTorch code for the WACV 2022 paper "CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows".
Stars: ✭ 138 (+331.25%)
Normalizing Flows: Implementation of normalizing flows on MNIST (https://arxiv.org/abs/1505.05770).
Stars: ✭ 14 (-56.25%)
ifl-tpp: Implementation of "Intensity-Free Learning of Temporal Point Processes" (spotlight @ ICLR 2020).
Stars: ✭ 58 (+81.25%)
continuous-time-flow-process: PyTorch code for "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020).
Stars: ✭ 34 (+6.25%)
MongeAmpereFlow: Continuous-time gradient flow for generative modeling and variational inference.
Stars: ✭ 29 (-9.37%)
spyn: Sum-Product Network learning routines in Python.
Stars: ✭ 21 (-34.37%)
markovian: 🎲 A Kotlin DSL for probabilistic programming.
Stars: ✭ 13 (-59.37%)
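Many of the repos above build on the same change-of-variables idea. As a rough illustration only (not taken from any of the listed codebases), here is a minimal NumPy sketch of the planar flow from the paper linked in the "Normalizing Flows" entry (https://arxiv.org/abs/1505.05770); all variable names are illustrative.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Apply a planar flow f(z) = z + u * tanh(w.z + b) to a batch of samples
    and return the transformed samples plus log|det Jacobian| per sample."""
    a = np.tanh(z @ w + b)              # scalar activation per sample, shape (n,)
    f_z = z + np.outer(a, u)            # transformed samples, shape (n, d)
    psi = (1.0 - a ** 2)[:, None] * w   # d/dz tanh(w.z + b), shape (n, d)
    log_det = np.log(np.abs(1.0 + psi @ u))
    return f_z, log_det

# Usage sketch: transform base N(0, I) samples and track the density correction.
rng = np.random.default_rng(0)
z = rng.standard_normal((5, 2))                     # base samples
u, w, b = np.array([0.5, -0.3]), np.array([1.0, 1.0]), 0.1
f_z, log_det = planar_flow(z, u, w, b)
base_logp = -0.5 * (z ** 2).sum(axis=1) - z.shape[1] * 0.5 * np.log(2 * np.pi)
logp = base_logp - log_det                          # log-density of f(z)
```

Richer flows (NanoFlow, UMNN, CFLOW-AD, etc.) replace this single transform with deep, learnable bijections, but the log-determinant bookkeeping is the same.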