DeeProb-kit
DeeProb-kit is a general-purpose Python library providing a collection of deep probabilistic models (DPMs) that are easy to use and extend. It also includes efficiently implemented learning techniques, inference routines, and statistical algorithms. Having a representative selection of the most common DPMs in a single library makes it straightforward to combine them, a common practice in deep learning research nowadays that is still missing for certain classes of models. Moreover, DeeProb-kit provides high-quality, fully documented APIs, helping the community accelerate research on DPMs and improve the reproducibility of experiments.
Features
- Inference algorithms for SPNs.[^1][^2]
- Structure learning algorithms for SPNs.[^1][^3][^4][^2][^5]
- Chow-Liu Trees (CLTs) as SPN leaves.[^6][^7]
- Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves.[^8][^9]
- Structural marginalization and pruning algorithms for SPNs.
- High-order moments computation for SPNs.
- JSON I/O operations for SPNs and CLTs.[^2]
- Plotting operations based on NetworkX for SPNs and CLTs.[^2]
- Randomized and Tensorized SPNs (RAT-SPNs).[^10]
- Deep Generalized Convolutional SPNs (DGC-SPNs).[^11]
- Masked Autoregressive Flows (MAFs).[^12]
- Real Non-Volume-Preserving (RealNVP) flows.[^13]
- Non-linear Independent Components Estimation (NICE) flows.[^14]
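The normalizing flows listed above all build on invertible coupling layers. As a library-independent illustration (not DeeProb-kit API), the sketch below shows the additive coupling transform introduced by NICE: half of the input passes through unchanged, while the other half is shifted by a function of the first half, so the transform is exactly invertible and has unit Jacobian determinant. The `shift_net` function is a hypothetical stand-in for a learned neural network.

```python
import numpy as np

def shift_net(h):
    # Stand-in for a learned neural network (hypothetical fixed transform).
    return 0.5 * np.tanh(h)

def coupling_forward(x):
    # Split the input in two halves; shift the second by a function of the first.
    h1, h2 = np.split(x, 2, axis=-1)
    return np.concatenate([h1, h2 + shift_net(h1)], axis=-1)

def coupling_inverse(z):
    # Inversion only requires re-evaluating shift_net on the unchanged half.
    h1, h2 = np.split(z, 2, axis=-1)
    return np.concatenate([h1, h2 - shift_net(h1)], axis=-1)

x = np.array([0.3, -1.2, 0.8, 2.0])
z = coupling_forward(x)
assert np.allclose(coupling_inverse(z), x)  # exact invertibility
```

Because the Jacobian of an additive coupling is unit-triangular, the log-density change under this transform is zero; RealNVP extends the same idea with a scaling term.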
The collection of implemented models is summarized in the following table.
| Model | Description |
|---|---|
| Binary-CLT | Binary Chow-Liu Tree (CLT) |
| SPN | Vanilla Sum-Product Network |
| MSPN | Mixed Sum-Product Network |
| XPC | Random Probabilistic Circuit |
| RAT-SPN | Randomized and Tensorized Sum-Product Network |
| DGC-SPN | Deep Generalized Convolutional Sum-Product Network |
| MAF | Masked Autoregressive Flow |
| NICE | Non-linear Independent Components Estimation Flow |
| RealNVP | Real-valued Non-Volume-Preserving Flow |
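The Binary-CLT model in the table is based on the classic Chow-Liu algorithm: estimate pairwise mutual information from data, then take a maximum spanning tree over it. The following is a library-independent sketch of that idea for binary data; `chow_liu_tree` is a hypothetical helper written here for illustration, not DeeProb-kit API.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_tree(data, alpha=0.01):
    """Return a parent array encoding a Chow-Liu tree over binary data.

    data: (n_samples, n_vars) array with values in {0, 1}.
    parent[v] is the parent of variable v; the root (variable 0) has parent -1.
    """
    n_samples, n_vars = data.shape
    mi = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            # Laplace-smoothed joint distribution of (X_i, X_j)
            joint = np.empty((2, 2))
            for a in (0, 1):
                for b in (0, 1):
                    joint[a, b] = np.sum((data[:, i] == a) & (data[:, j] == b)) + alpha
            joint /= joint.sum()
            p_i, p_j = joint.sum(axis=1), joint.sum(axis=0)
            # Pairwise mutual information (non-negative)
            mi[i, j] = np.sum(joint * np.log(joint / np.outer(p_i, p_j)))
    # A maximum spanning tree over MI is a minimum spanning tree over -MI
    mst = minimum_spanning_tree(-mi).toarray()
    adjacency = (mst != 0) | (mst.T != 0)
    # Orient the undirected tree away from variable 0 to get parent pointers
    parent = np.full(n_vars, -1)
    stack, visited = [0], {0}
    while stack:
        u = stack.pop()
        for v in np.flatnonzero(adjacency[u]):
            if v not in visited:
                parent[v] = u
                visited.add(v)
                stack.append(v)
    return parent
```

Parameterizing each variable's conditional distribution given its parent then yields a tractable tree-shaped distribution, which is what makes CLTs attractive as multivariate SPN leaves.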
Installation
The library can be installed either from the PyPI repository or from source.

```shell
# Install from the PyPI repository
pip install deeprob-kit

# Install from the `main` git branch
pip install -e "git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit"
```
Project Directories
The documentation is generated automatically by Sphinx using sources stored in the `docs` directory.
A collection of code examples and experiments can be found in the `examples` and `experiments` directories, respectively. Moreover, benchmark code can be found in the `benchmark` directory.
Related Repositories
References

[^1]: Peharz et al. On Theoretical Properties of Sum-Product Networks. AISTATS (2015).
[^2]: Molina, Vergari et al. SPFLOW: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks. CoRR (2019).
[^3]: Poon and Domingos. Sum-Product Networks: A New Deep Architecture. UAI (2011).
[^4]: Molina, Vergari et al. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains. AAAI (2018).
[^5]: Di Mauro et al. Sum-Product Network Structure Learning by Efficient Product Nodes Discovery. AIxIA (2018).
[^6]: Rahman et al. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees. ECML-PKDD (2014).
[^7]: Di Mauro, Gala et al. Random Probabilistic Circuits. UAI (2021).
[^8]: Desana and Schnörr. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization. CoRR (2016).
[^9]: Peharz et al. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits. ICML (2020).
[^10]: Peharz et al. Probabilistic Deep Learning using Random Sum-Product Networks. UAI (2020).
[^11]: Van de Wolfshaar and Pronobis. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations. PGM (2020).
[^12]: Papamakarios et al. Masked Autoregressive Flow for Density Estimation. NeurIPS (2017).
[^13]: Dinh et al. Density Estimation using Real NVP. ICLR (2017).
[^14]: Dinh et al. NICE: Non-linear Independent Components Estimation. ICLR (2015).