
suriyadeepan / Pm Pyro

License: GPL-3.0
PyMC3-like Interface for Pyro

Projects that are alternatives of or similar to Pm Pyro

Seismic Transfer Learning
Deep-learning seismic facies on state-of-the-art CNN architectures
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Ml Weekly
My weekly of machine learning. Collection of implemented algorithms.
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Omx
Open Matrix (OMX)
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Itri Speech Recognition Dataset Generation
Automatic Speech Recognition Dataset Generation
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Densedepth
High Quality Monocular Depth Estimation via Transfer Learning
Stars: ✭ 963 (+2818.18%)
Mutual labels:  jupyter-notebook
2016learnpython
Python Teaching, Seminars for 2nd year students of School of Linguistics NRU HSE
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Pn2v
This is our implementation of Probabilistic Noise2Void
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Gaze Estimation
A deep learning based gaze estimation framework implemented with PyTorch
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Mooc Neurons And Synapses 2017
Reference data for the "Simulation Neuroscience: Neurons and Synapses" Massive Open Online Course
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Madmom tutorials
Tutorials for the madmom package.
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Tf eager
Exercises for tf (eager) learning
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Nbmultitask
multithreading/multiprocessing with ipywidgets and jupyter notebooks
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Pyhat
Python Hyperspectral Analysis Tools
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Ndap Fa2018
neuro data analysis in python
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Machinelearningdeeplearning
Notes, slides, and assignments for Hung-yi Lee's 2021 Machine Learning / Deep Learning course
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Pytorch Mnist Vae
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Musings
Repo of some side issues/sub-problems, and their solutions, that I have encountered in my ML work over the past few years.
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Yolact Tutorial
A tutorial for using YOLACT in Google Colab
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Lectures2020
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Pydata Amsterdam 2016
Machine Learning with Scikit-Learn (material for PyData Amsterdam 2016)
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook

PyMC3-like abstractions for Pyro's stochastic functions. Define a model as a stochastic function in Pyro, then use the pm_like wrapper to create a PyMC3-esque Model. Random variables are exposed to the user as attributes of the Model. pm-pyro provides abstractions for sampling-based inference methods (NUTS, the No-U-Turn Sampler; HMC, Hamiltonian Monte Carlo) as well as Variational Inference (SVI with autoguides), trace plots, posterior plots, and posterior predictive plots.

Install

Install from pypi

pip install pm-pyro

Developer setup

# install requirements
pip install -r requirements-dev.txt
# run tests
python -m pytest pmpyro/tests.py

Example

The example is borrowed from a PyMC3 tutorial. The outcome variable Y depends on two features, X_1 and X_2. The notebook for this example is available here.
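
The dataset itself is not shown in this README; below is a minimal sketch for generating comparable synthetic data, assuming the same setup as the PyMC3 getting-started tutorial (the true parameter values, the sample size, and the names X1, X2, Y are assumptions).

import torch

# Hypothetical synthetic data, mirroring the PyMC3 tutorial setup;
# the true parameter values and sample size below are assumptions.
torch.manual_seed(0)
size = 100
true_alpha, true_sigma = 1.0, 1.0
true_beta = torch.tensor([1.0, 2.5])

X1 = torch.randn(size)
X2 = torch.randn(size) * 0.2
Y = true_alpha + true_beta[0] * X1 + true_beta[1] * X2 + torch.randn(size) * true_sigma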

Model Specification

We design a simple Bayesian Linear Regression model.

Stochastic Function

The model specification is implemented as a stochastic function.

import pyro.distributions as dist
import pyro
import torch

def pyro_model(x1, x2, y):
    alpha = pyro.sample('alpha', dist.Normal(0, 10))
    beta = pyro.sample('beta', dist.Normal(torch.zeros(2), torch.ones(2) * 10.))
    sigma = pyro.sample('sigma', dist.HalfNormal(1.))

    # Expected value of outcome
    mu = alpha + beta[0] * x1 + beta[1] * x2

    # Likelihood (sampling distribution) of observations
    return pyro.sample('y_obs', dist.Normal(mu, sigma), obs=y)

Context-manager Syntax

The pm_like wrapper creates a PyMC3-esque Model. We can use the context-manager syntax for running inference. pm.sample draws samples from the model using the NUTS sampler. The trace is a Python dictionary that contains the samples.

from pmpyro import pm_like
import pmpyro as pm

with pm_like(pyro_model, X1, X2, Y) as model:
    trace = pm.sample(1000)
sample: 100%|██████████| 1300/1300 [00:16, 80.42it/s, step size=7.49e-01, acc. prob=0.911] 
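
Since the trace is a plain Python dictionary keyed by variable name, it can be inspected directly; the short sketch below assumes nothing beyond that (the exact array types and shapes are assumptions).

# Inspect the returned trace; value types/shapes are assumptions,
# e.g. 'alpha' -> (1000,), 'beta' -> (1000, 2), 'sigma' -> (1000,).
for name, samples in trace.items():
    print(name, getattr(samples, 'shape', type(samples)))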

Traceplot

We can visualize the samples using traceplot. Select random variables by passing them as a list via the var_names=['alpha', ...] argument.

pm.traceplot(trace)
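
For instance, to restrict the plot to a subset of the random variables:

# Plot only the selected random variables.
pm.traceplot(trace, var_names=['alpha', 'beta'])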

Plot Posterior

Visualize the posteriors of random variables using plot_posterior.

pm.plot_posterior(trace, var_names=['beta'])

Posterior Predictive Samples

We can sample from the posterior by running plot_posterior_predictive or sample_posterior_predictive with the same function signature as the stochastic function def pyro_model(x1, x2, y), replacing the observed variable Y with None.

ppc = pm.plot_posterior_predictive(X1, X2, None,
                          trace=trace, model=model, samples=60,
                          alpha=0.08, obs={'y_obs' : Y})
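
To work with the raw predictive draws instead of a plot, sample_posterior_predictive can be called the same way; the sketch below assumes it accepts the same keyword arguments, and the structure of the returned object is an assumption.

# Draw posterior predictive samples without plotting them.
# The keyword arguments mirror plot_posterior_predictive; the exact
# structure of `ppc_samples` (e.g. a dict keyed by 'y_obs') is an assumption.
ppc_samples = pm.sample_posterior_predictive(X1, X2, None,
                          trace=trace, model=model, samples=60)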

Trace Summary

The summary of the random variables is available as a pandas DataFrame.

pm.summary()
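
Assuming the returned object is a pandas DataFrame, it can be filtered or saved like any other DataFrame:

# Assuming pm.summary() returns a pandas DataFrame (an assumption),
# persist the trace statistics for later inspection.
stats = pm.summary()
stats.to_csv('trace_summary.csv')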

License

This project is licensed under the GPL v3 License; see the LICENSE.md file for details.
