ml-uol / prosper

License: AFL-3.0
A Python Library for Probabilistic Sparse Coding with Non-Standard Priors and Superpositions

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to prosper

ReactiveMP.jl
Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (+241.18%)
Mutual labels:  variational-inference, probabilistic-graphical-models
MvMM-RegNet
Code for paper: MvMM-RegNet: A new image registration framework based on multivariate mixture model and neural network estimation
Stars: ✭ 22 (+29.41%)
Mutual labels:  probabilistic-graphical-models
MCF-3D-CNN
Temporal-spatial Feature Learning of DCE-MR Images via 3DCNN
Stars: ✭ 43 (+152.94%)
Mutual labels:  feature-learning
adaptive-f-divergence
A tensorflow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (+17.65%)
Mutual labels:  variational-inference
bnp
Bayesian nonparametric models for python
Stars: ✭ 17 (+0%)
Mutual labels:  probabilistic-graphical-models
active-inference
A toy model of Friston's active inference in Tensorflow
Stars: ✭ 36 (+111.76%)
Mutual labels:  variational-inference
vireo
Demultiplexing pooled scRNA-seq data with or without genotype reference
Stars: ✭ 34 (+100%)
Mutual labels:  variational-inference
probai-2021-pyro
Repo for the Tutorials of Day1-Day3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (+164.71%)
Mutual labels:  variational-inference
BayesByHypernet
Code for the paper Implicit Weight Uncertainty in Neural Networks
Stars: ✭ 63 (+270.59%)
Mutual labels:  variational-inference
SelSum
Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference paper.
Stars: ✭ 36 (+111.76%)
Mutual labels:  variational-inference
rss
Regression with Summary Statistics.
Stars: ✭ 42 (+147.06%)
Mutual labels:  variational-inference
ClassifierToolbox
A MATLAB toolbox for classifier: Version 1.0.7
Stars: ✭ 72 (+323.53%)
Mutual labels:  sparse-coding
AI Learning Hub
AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (+211.76%)
Mutual labels:  variational-inference
ccube
Bayesian mixture models for estimating and clustering cancer cell fractions
Stars: ✭ 23 (+35.29%)
Mutual labels:  variational-inference
VINF
Repository for DTU Special Course, focusing on Variational Inference using Normalizing Flows (VINF). Supervised by Michael Riis Andersen
Stars: ✭ 23 (+35.29%)
Mutual labels:  variational-inference
MIRT.jl
MIRT: Michigan Image Reconstruction Toolbox (Julia version)
Stars: ✭ 80 (+370.59%)
Mutual labels:  sparse-coding
PyLDA
A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (+200%)
Mutual labels:  variational-inference
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (+70.59%)
Mutual labels:  variational-inference
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (+135.29%)
Mutual labels:  variational-inference
dictlearn
Dictionary Learning for image processing
Stars: ✭ 23 (+35.29%)
Mutual labels:  sparse-coding


Introduction

This package contains the source code to reproduce the numerical experiments described in the papers referenced below. It provides a parallelized implementation of the Binary Sparse Coding (BSC) [1], Gaussian Sparse Coding (GSC) [2], Maximal Causes Analysis (MCA) [3], Maximum Magnitude Causes Analysis (MMCA) [4], Ternary Sparse Coding (TSC) [5], and Discrete Sparse Coding (DSC) [7] models. All of these probabilistic generative models are trained using a truncated Expectation Maximization (EM) algorithm [6].
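
The truncation in [6] makes training tractable by restricting the E-step sum over latent states to a small candidate set instead of all 2^H binary combinations. The snippet below is a schematic illustration of that idea, not prosper's internal code; the function name and enumeration scheme are purely illustrative:

  import itertools
  import numpy as np

  # Enumerate binary latent states that are non-zero only on Hprime
  # preselected dimensions and have at most gamma active units.
  def truncated_states(H, Hprime, gamma, candidate_dims):
      states = []
      for k in range(gamma + 1):
          for active in itertools.combinations(candidate_dims[:Hprime], k):
              s = np.zeros(H, dtype=int)
              s[list(active)] = 1
              states.append(s)
      return np.array(states)

  # e.g. 10 latent units, preselect 6 candidates, allow up to 3 active:
  states = truncated_states(10, 6, 3, list(range(10)))
  print(states.shape)  # 42 states, far fewer than 2**10

Posterior expectations in the E-step are then computed over this reduced state set, which is where truncation parameters such as H' (number of preselected units) and gamma (maximal number of active units) come from.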

Software dependencies

Python-related dependencies can be installed using:

  $ pip install -r requirements.txt

mpi4py also requires a system-level installation of MPI. On macOS, you can install one using Homebrew:

  $ brew install mpich

On Ubuntu systems:

  $ sudo apt install mpich

For any other system, review the relevant section of the mpi4py installation guidelines.
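
To verify that mpi4py can talk to the MPI runtime, you can run a quick sanity check (save as check_mpi.py; the filename is arbitrary):

  # Run with: mpirun -np 2 python check_mpi.py
  from mpi4py import MPI

  comm = MPI.COMM_WORLD
  print("Hello from rank %d of %d" % (comm.Get_rank(), comm.Get_size()))

If each rank prints a line, MPI and mpi4py are installed correctly.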

Overview

prosper/ - Python library/framework for MPI-parallelized EM-based algorithms. The model implementations can be found in prosper/em/camodels/.

examples/ - Small examples for initializing and running the models
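
The sketch below shows how such a model is typically constructed in a script. The class names and signatures here are assumptions based on the module layout in prosper/em/camodels/; treat the scripts under examples/ as the authoritative reference.

  # A minimal construction sketch; names and signatures are assumptions.
  from prosper.em import EM
  from prosper.em.annealing import LinearAnnealing
  from prosper.em.camodels.bsc_et import BSC_ET

  D, H = 25, 10          # observed and latent dimensionality (example values)
  Hprime, gamma = 6, 5   # truncation parameters of the ET approximation

  model = BSC_ET(D, H, Hprime, gamma)   # Binary Sparse Coding with ET
  anneal = LinearAnnealing(50)          # annealing schedule over 50 EM steps
  em = EM(model=model, anneal=anneal)   # EM driver; data assignment omitted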

Installation

To install the library run:

  $ git clone https://github.com/ml-uol/prosper.git
  $ cd prosper
  $ python setup.py install

Optionally you can replace the final line with:

  $ python setup.py develop

This option installs the library via links, so edits to the source take effect without reinstalling (useful for Prosper developers).

Running

To run some toy examples:

  $ cd examples/barstest
  $ python bars-learning-and-inference.py param-bars-<...>.py

where <...> should be appropriately replaced to correspond to one of the parameter files available in the directory. The bars-learning-and-inference.py script then initializes and runs the algorithm that corresponds to the chosen parameter file.

Results/Output

The results produced by the code are stored in a results.h5 file under ./output/.../. For each EM iteration performed, the file stores the model parameters (e.g., W, pi). To read the results file, you can use the open_file function of the PyTables (tables) package; it was called openFile in PyTables versions before 3.0. The results files can also easily be read from other environments such as MATLAB.
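
For example, a minimal sketch for inspecting a results file with PyTables; the node names W and pi are assumptions based on the parameter names above, and the path is hypothetical (substitute the actual ./output/<run>/ directory created by your run):

  import tables

  # Hypothetical run directory; substitute your actual output path.
  with tables.open_file("output/example-run/results.h5", "r") as h5:
      W = h5.root.W[:]    # assumed node: W for every EM iteration
      pi = h5.root.pi[:]  # assumed node: pi for every EM iteration
      print("W:", W.shape, "pi:", pi.shape)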

Running on a parallel architecture

The code uses MPI-based parallelization. If you have parallel resources (i.e., a multi-core system or a compute cluster), the provided code can make use of them by evenly distributing the training data among multiple cores.
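
Conceptually, the even split assigns each MPI rank its own share of the data points. The snippet below illustrates the idea and is not prosper's internal partitioning code:

  from mpi4py import MPI
  import numpy as np

  comm = MPI.COMM_WORLD
  N = 1000  # total number of training points (example value)
  rank, size = comm.Get_rank(), comm.Get_size()
  my_points = np.array_split(np.arange(N), size)[rank]  # this rank's share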

To run the same script as above, e.g.,

a) On a multi-core machine with 32 cores:

  $ mpirun -np 32 python bars-learning-and-inference.py param-bars-<...>.py

b) On a cluster:

  $ mpirun --hostfile machines python bars-learning-and-inference.py param-bars-<...>.py

where machines is a file containing a list of suitable hosts.

See your MPI documentation for the details on how to start MPI parallelized programs.

References

[1] M. Henniges, G. Puertas, J. Bornschein, J. Eggert, and J. Lücke (2010). Binary Sparse Coding. Proc. LVA/ICA 2010, LNCS 6365, 450-457.

[2] A.-S. Sheikh, J. A. Shelton, J. Lücke (2014). A Truncated EM Approach for Spike-and-Slab Sparse Coding. Journal of Machine Learning Research, 15:2653-2687.

[3] G. Puertas, J. Bornschein, and J. Lücke (2010). The Maximal Causes of Natural Scenes are Edge Filters. Advances in Neural Information Processing Systems 23, 1939-1947.

[4] J. Bornschein, M. Henniges, J. Lücke (2013). Are V1 simple cells optimized for visual occlusions? A comparative study. PLOS Computational Biology 9(6): e1003062.

[5] G. Exarchakis, M. Henniges, J. Eggert, and J. Lücke (2012). Ternary Sparse Coding. International Conference on Latent Variable Analysis and Signal Separation (LVA/ICA), 204-212.

[6] J. Lücke and J. Eggert (2010). Expectation Truncation and the Benefits of Preselection in Training Generative Models. Journal of Machine Learning Research 11:2855-2900.

[7] G. Exarchakis, and J. Lücke (2017). Discrete Sparse Coding. Neural Computation, 29(11), 2979-3013.
