deeprob-org / deeprob-kit

License: MIT
A Python Library for Deep Probabilistic Modeling


Projects that are alternatives of or similar to deeprob-kit

flowtorch-old
Separating Normalizing Flows code from Pyro and improving API
Stars: ✭ 36 (+12.5%)
Mutual labels:  probabilistic-models, normalizing-flows
NanoFlow
PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity." (NeurIPS 2020)
Stars: ✭ 63 (+96.88%)
Mutual labels:  probabilistic-models, normalizing-flows
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-37.5%)
Mutual labels:  normalizing-flows
probai-2019
Materials of the Nordic Probabilistic AI School 2019.
Stars: ✭ 127 (+296.88%)
Mutual labels:  probabilistic-models
probai-2021-pyro
Repo for the Tutorials of Day1-Day3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (+40.63%)
Mutual labels:  probabilistic-models
score flow
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (+53.13%)
Mutual labels:  normalizing-flows
probabilistic-circuits
A curated collection of papers on probabilistic circuits, computational graphs encoding tractable probability distributions.
Stars: ✭ 33 (+3.13%)
Mutual labels:  probabilistic-models
normalizing-flows
Implementations of normalizing flows using python and tensorflow
Stars: ✭ 15 (-53.12%)
Mutual labels:  normalizing-flows
MMCAcovid19.jl
Microscopic Markov Chain Approach to model the spreading of COVID-19
Stars: ✭ 15 (-53.12%)
Mutual labels:  probabilistic-models
mta
Multi-Touch Attribution
Stars: ✭ 60 (+87.5%)
Mutual labels:  probabilistic-models
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Stars: ✭ 1,211 (+3684.38%)
Mutual labels:  normalizing-flows
Machine Learning From Scratch
Machine Learning models from scratch with a better visualisation
Stars: ✭ 15 (-53.12%)
Mutual labels:  probabilistic-models
nessai
nessai: Nested Sampling with Artificial Intelligence
Stars: ✭ 18 (-43.75%)
Mutual labels:  normalizing-flows
Probability Theory
A quick introduction to all most important concepts of Probability Theory, only freshman level of mathematics needed as prerequisite.
Stars: ✭ 25 (-21.87%)
Mutual labels:  probabilistic-models
UMNN
Implementation of Unconstrained Monotonic Neural Network and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Stars: ✭ 63 (+96.88%)
Mutual labels:  normalizing-flows
semi-supervised-NFs
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning
Stars: ✭ 23 (-28.12%)
Mutual labels:  normalizing-flows
probai-2021
Materials of the Nordic Probabilistic AI School 2021.
Stars: ✭ 83 (+159.38%)
Mutual labels:  probabilistic-models
blangSDK
Blang's software development kit
Stars: ✭ 21 (-34.37%)
Mutual labels:  probabilistic-models
deepdb-public
Implementation of DeepDB: Learn from Data, not from Queries!
Stars: ✭ 61 (+90.63%)
Mutual labels:  sum-product-networks
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (+25%)
Mutual labels:  probabilistic-models



DeeProb-kit

DeeProb-kit is a general-purpose Python library providing a collection of deep probabilistic models (DPMs) that are easy to use and extend. It also includes efficient implementations of learning techniques, inference routines, and statistical algorithms. Having a representative selection of the most common DPMs in a single library makes it straightforward to combine them, a practice that is now common in deep learning research but still missing for certain classes of models. Moreover, DeeProb-kit provides high-quality, fully documented APIs that help the community accelerate research on DPMs and improve the reproducibility of experiments.

Features

  • Inference algorithms for SPNs. 1 2
  • Learning algorithms for SPNs structure. 1 3 4 2 5
  • Chow-Liu Trees (CLT) as SPN leaves. 6 7
  • Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves. 8 9
  • Structural marginalization and pruning algorithms for SPNs.
  • High-order moments computation for SPNs.
  • JSON I/O operations for SPNs and CLTs. 2
  • Plotting operations based on NetworkX for SPNs and CLTs. 2
  • Randomized And Tensorized SPNs (RAT-SPNs). 10
  • Deep Generalized Convolutional SPNs (DGC-SPNs). 11
  • Masked Autoregressive Flows (MAFs). 12
  • Real Non-Volume-Preserving (RealNVP) flows. 13
  • Non-linear Independent Component Estimation (NICE) flows. 14
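As a rough illustration of what makes SPNs tractable (plain Python, not deeprob-kit's API), the following sketch evaluates a tiny SPN over two binary variables: a sum node mixes two product nodes, each the product of independent Bernoulli leaves. Both the full joint and exact marginals come out of the same bottom-up evaluation, which is the key property the inference algorithms above exploit.

```python
from itertools import product

# Leaf densities: Bernoulli distributions over single binary variables.
def bernoulli(p):
    return lambda x: p if x == 1 else 1.0 - p

# Two mixture components; each is a product node over independent leaves.
components = [
    (0.3, bernoulli(0.9), bernoulli(0.2)),  # weight, P(X0), P(X1)
    (0.7, bernoulli(0.1), bernoulli(0.8)),
]

def spn(x0, x1):
    """Sum node over product nodes over leaves: sum_k w_k * P_k(x0) * P_k(x1)."""
    return sum(w * f0(x0) * f1(x1) for w, f0, f1 in components)

# Tractability check: the SPN defines a valid distribution (sums to 1),
# and exact marginals are obtained by summing out variables.
total = sum(spn(a, b) for a, b in product([0, 1], repeat=2))
p_x0_is_1 = sum(spn(1, b) for b in [0, 1])
print(total, p_x0_is_1)  # 1.0 and 0.34
```

Real SPNs learned by the library are much deeper and operate in log-space, but the same bottom-up pass underlies their inference routines.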

The collection of implemented models is summarized in the following table.

Model        Description
Binary-CLT   Binary Chow-Liu Tree (CLT)
SPN          Vanilla Sum-Product Network
MSPN         Mixed Sum-Product Network
XPC          Random Probabilistic Circuit
RAT-SPN      Randomized and Tensorized Sum-Product Network
DGC-SPN      Deep Generalized Convolutional Sum-Product Network
MAF          Masked Autoregressive Flow
NICE         Non-linear Independent Components Estimation Flow
RealNVP      Real-valued Non-Volume-Preserving Flow
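The flow models in the table (MAF, NICE, RealNVP) are built from invertible transformations with tractable Jacobians. As a minimal NumPy sketch, here is a NICE-style additive coupling layer, illustrative only and with a hypothetical tanh shift network, not the library's implementation: one half of the input passes through unchanged, the other half is shifted by a function of the first, so the inverse is exact and the Jacobian log-determinant is zero.

```python
import numpy as np

def additive_coupling_forward(x, shift_fn):
    """NICE-style additive coupling: split x in half, leave x1 unchanged,
    shift x2 by a function of x1. The Jacobian is triangular with a unit
    diagonal, so log|det J| = 0 and the log-likelihood is easy to compute."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    return np.concatenate([x1, x2 + shift_fn(x1)], axis=-1)

def additive_coupling_inverse(z, shift_fn):
    """Exact inverse: subtract the same shift computed from the unchanged half."""
    d = z.shape[-1] // 2
    z1, z2 = z[..., :d], z[..., d:]
    return np.concatenate([z1, z2 - shift_fn(z1)], axis=-1)

# A toy "network" for the shift: any function of the untouched half works.
shift = lambda h: np.tanh(h @ np.array([[0.5, -1.0], [1.2, 0.3]]))

x = np.array([[0.1, -0.4, 2.0, 0.7]])
z = additive_coupling_forward(x, shift)
x_rec = additive_coupling_inverse(z, shift)
print(np.allclose(x, x_rec))  # True: the layer is exactly invertible
```

RealNVP extends this scheme with a learned per-dimension scale (giving a non-zero but still triangular log-determinant), and MAF generalizes the conditioning to a full autoregressive ordering.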

Installation

The library can be installed either from the PyPI repository or from source.

# Install from PIP repository
pip install deeprob-kit
# Install from `main` git branch
pip install -e git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit

Project Directories

The documentation is generated automatically by Sphinx using sources stored in the docs directory.

A collection of code examples and experiments can be found in the examples and experiments directories respectively. Moreover, benchmark code can be found in the benchmark directory.

Related Repositories

References

Footnotes

  1. Peharz et al. On Theoretical Properties of Sum-Product Networks. AISTATS (2015).

  2. Molina, Vergari et al. SPFlow: An easy and extensible library for deep probabilistic learning using Sum-Product Networks. CoRR (2019).

  3. Poon and Domingos. Sum-Product Networks: A New Deep Architecture. UAI (2011).

  4. Molina, Vergari et al. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains. AAAI (2018).

  5. Di Mauro et al. Sum-Product Network structure learning by efficient product nodes discovery. AIxIA (2018).

  6. Rahman et al. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees. ECML-PKDD (2014).

  7. Di Mauro, Gala et al. Random Probabilistic Circuits. UAI (2021).

  8. Desana and Schnörr. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization. CoRR (2016).

  9. Peharz et al. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits. ICML (2020).

  10. Peharz et al. Probabilistic Deep Learning using Random Sum-Product Networks. UAI (2020).

  11. Van de Wolfshaar and Pronobis. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations. PGM (2020).

  12. Papamakarios et al. Masked Autoregressive Flow for Density Estimation. NeurIPS (2017).

  13. Dinh et al. Density Estimation using RealNVP. ICLR (2017).

  14. Dinh et al. NICE: Non-linear Independent Components Estimation. ICLR (2015).
