erdogant / Bnlearn

Licence: other
Python package for learning the graphical structure of Bayesian networks, parameter learning, inference and sampling methods.

Projects that are alternatives of or similar to Bnlearn

Celeste.jl
Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (+178.43%)
Mutual labels:  jupyter-notebook, bayesian-inference
Bayesian Analysis Recipes
A collection of Bayesian data analysis recipes using PyMC3
Stars: ✭ 479 (+839.22%)
Mutual labels:  jupyter-notebook, bayesian-inference
Rethinking Tensorflow Probability
Statistical Rethinking (2nd Ed) with Tensorflow Probability
Stars: ✭ 152 (+198.04%)
Mutual labels:  jupyter-notebook, bayesian-inference
A Nice Mc
Code for "A-NICE-MC: Adversarial Training for MCMC"
Stars: ✭ 115 (+125.49%)
Mutual labels:  jupyter-notebook, bayesian-inference
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+1664.71%)
Mutual labels:  jupyter-notebook, bayesian-inference
Rethinking Pyro
Statistical Rethinking with PyTorch and Pyro
Stars: ✭ 116 (+127.45%)
Mutual labels:  jupyter-notebook, bayesian-inference
Vae cf
Variational autoencoders for collaborative filtering
Stars: ✭ 386 (+656.86%)
Mutual labels:  jupyter-notebook, bayesian-inference
Pymc Example Project
Example PyMC3 project for performing Bayesian data analysis using a probabilistic programming approach to machine learning.
Stars: ✭ 90 (+76.47%)
Mutual labels:  jupyter-notebook, bayesian-inference
Bda py demos
Bayesian Data Analysis demos for Python
Stars: ✭ 781 (+1431.37%)
Mutual labels:  jupyter-notebook, bayesian-inference
Dbda Python
Doing Bayesian Data Analysis, 2nd Edition (Kruschke, 2015): Python/PyMC3 code
Stars: ✭ 502 (+884.31%)
Mutual labels:  jupyter-notebook, bayesian-inference
Pymc3 vs pystan
Personal project to compare hierarchical linear regression in PyMC3 and PyStan, as presented at http://pydata.org/london2016/schedule/presentation/30/ video: https://www.youtube.com/watch?v=Jb9eklfbDyg
Stars: ✭ 110 (+115.69%)
Mutual labels:  jupyter-notebook, bayesian-inference
Resources
PyMC3 educational resources
Stars: ✭ 930 (+1723.53%)
Mutual labels:  jupyter-notebook, bayesian-inference
Neural Tangents
Fast and Easy Infinite Neural Networks in Python
Stars: ✭ 1,357 (+2560.78%)
Mutual labels:  jupyter-notebook, bayesian-inference
Glmm In Python
Generalized linear mixed-effect model in Python
Stars: ✭ 131 (+156.86%)
Mutual labels:  jupyter-notebook, bayesian-inference
Bayesian Cognitive Modeling In Pymc3
PyMC3 codes of Lee and Wagenmakers' Bayesian Cognitive Modeling - A Practical Course
Stars: ✭ 93 (+82.35%)
Mutual labels:  jupyter-notebook, bayesian-inference
Sbi
Simulation-based inference in PyTorch
Stars: ✭ 164 (+221.57%)
Mutual labels:  jupyter-notebook, bayesian-inference
Alice
NIPS 2017: ALICE: Towards Understanding Adversarial Learning for Joint Distribution Matching
Stars: ✭ 80 (+56.86%)
Mutual labels:  jupyter-notebook, bayesian-inference
Bayesian Stats Modelling Tutorial
How to do Bayesian statistical modelling using numpy and PyMC3
Stars: ✭ 480 (+841.18%)
Mutual labels:  jupyter-notebook, bayesian-inference
Pycurious
Python package for computing the Curie depth from the magnetic anomaly
Stars: ✭ 22 (-56.86%)
Mutual labels:  jupyter-notebook, bayesian-inference
Hmm for autonomous driving
🎓 Educational application of Hidden Markov Model to Autonomous Driving 🚕🚙🚗
Stars: ✭ 39 (-23.53%)
Mutual labels:  jupyter-notebook, bayesian-inference

bnlearn - Graphical structure of Bayesian networks

[Badges: Python version, PyPI version, license, GitHub forks, open issues, project status, downloads, Sphinx docs, Open In Colab]

Star it if you like it!

bnlearn is a Python package for learning the graphical structure of Bayesian networks, parameter learning, inference, and sampling. This work is inspired by the R package (bnlearn.com), which has been very useful to me for many years. Although there are very good Python packages for probabilistic graphical models, it can still be difficult (and sometimes unnecessarily so) to (re)build certain pipelines. bnlearn for Python (this package) is built on top of the pgmpy package and contains the most-wanted pipelines. Navigate to the API documentation for more detailed information.

Method overview

Learning a Bayesian network can be split into two problems which are both implemented in this package:

  • Structure learning: Given a set of data samples, estimate a DAG that captures the dependencies between the variables.
  • Parameter learning: Given a set of data samples and a DAG that captures the dependencies between the variables, estimate the (conditional) probability distributions of the individual variables.

The following functions are available after installation:

# Import library
import bnlearn as bn

# Structure learning
bn.structure_learning.fit()

# Parameter learning
bn.parameter_learning.fit()

# Inference
bn.inference.fit()

# Based on a DAG, you can generate as many samples as you want.
bn.sampling()

# Load well-known examples to play around with, or load your own .bif file.
bn.import_DAG()

# Load a simple dataframe of the sprinkler dataset.
bn.import_example()

# Compare 2 graphs
bn.compare_networks()

# Plot graph
bn.plot()

# Make the directed graph undirected
bn.to_undirected()

# Convert to a one-hot data matrix
bn.df2onehot()
 
# See the examples below for how these functions work.
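These functions can be chained into a complete workflow. Below is a minimal sketch (using only the functions listed above and the bundled sprinkler example) that learns a structure, fits its parameters, and queries the result:

import bnlearn as bn

# Load the bundled sprinkler example dataset
df = bn.import_example()
# Learn the graphical structure from the data
model = bn.structure_learning.fit(df)
# Learn the (conditional) probability distributions given the structure
model = bn.parameter_learning.fit(model, df)
# Query the resulting network
q = bn.inference.fit(model, variables=['Wet_Grass'], evidence={'Rain': 1})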

The following methods are also included:

  • inference
  • sampling
  • comparing two networks
  • loading bif files
  • conversion of directed to undirected graphs

Installation

It is advisable to create a new environment.

conda create -n env_bnlearn python=3.8
conda activate env_bnlearn

Conda installation

conda install -c ankurankan pgmpy
pip install -U bnlearn # -U forces an upgrade to the latest version

Pip installation

pip install -U 'pgmpy>=0.1.13'
pip install -U bnlearn # -U forces an upgrade of the currently installed version
  • Alternatively, install bnlearn from the GitHub source:
git clone https://github.com/erdogant/bnlearn.git
cd bnlearn
pip install -U .
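A quick way to verify the installation (assuming bnlearn exposes a __version__ attribute, as most PyPI packages do):

python -c "import bnlearn as bn; print(bn.__version__)"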

Import bnlearn package

import bnlearn as bn

Example: Structure Learning

# Example dataframe sprinkler_data.csv can be loaded with: 
df = bn.import_example()
# df = pd.read_csv('sprinkler_data.csv')
model = bn.structure_learning.fit(df)
G = bn.plot(model)

The dataframe df looks like this:

     Cloudy  Sprinkler  Rain  Wet_Grass
0         0          1     0          1
1         1          1     1          1
2         1          0     1          1
3         0          0     1          1
4         1          0     1          1
..      ...        ...   ...        ...
995       0          0     0          0
996       1          0     0          0
997       0          0     1          0
998       1          1     0          1
999       1          0     1          1

  • Choosing various method types (methodtype) and scoring types (scoretype):
model_hc_bic  = bn.structure_learning.fit(df, methodtype='hc', scoretype='bic')
model_hc_k2   = bn.structure_learning.fit(df, methodtype='hc', scoretype='k2')
model_hc_bdeu = bn.structure_learning.fit(df, methodtype='hc', scoretype='bdeu')
model_ex_bic  = bn.structure_learning.fit(df, methodtype='ex', scoretype='bic')
model_ex_k2   = bn.structure_learning.fit(df, methodtype='ex', scoretype='k2')
model_ex_bdeu = bn.structure_learning.fit(df, methodtype='ex', scoretype='bdeu')
model_cl      = bn.structure_learning.fit(df, methodtype='cl', root_node='Wet_Grass')
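Because every call returns a learned model, the resulting structures can be compared with the compare_networks function shown in the overview. A sketch, reusing only functions from this page:

# Compare, for example, the hillclimb-BIC structure with the exhaustive-BIC structure
bn.compare_networks(model_hc_bic, model_ex_bic)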

Example: Parameter Learning

# Import dataframe
df = bn.import_example()
# As an example, we set CPD=False, which returns an "empty" DAG (structure only, no probabilities)
model = bn.import_DAG('sprinkler', CPD=False)
# Now we learn the parameters of the DAG using the df
model_update = bn.parameter_learning.fit(model, df)
# Make plot
G = bn.plot(model_update)
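The parameter-learned model can be used directly for inference, just like the pre-specified sprinkler DAG in the next example. A sketch, assuming the same variable names as in the dataframe shown above:

# Query the parameter-learned model
q = bn.inference.fit(model_update, variables=['Wet_Grass'], evidence={'Rain': 1, 'Sprinkler': 0})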

Example: Inference

model = bn.import_DAG('sprinkler')
q_1 = bn.inference.fit(model, variables=['Rain'], evidence={'Cloudy':1,'Sprinkler':0, 'Wet_Grass':1})
q_2 = bn.inference.fit(model, variables=['Rain'], evidence={'Cloudy':1})
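The variables and evidence arguments follow the pgmpy query interface, so multiple variables can presumably be queried at once, and the returned object can be printed to show the posterior distribution. A sketch:

q_3 = bn.inference.fit(model, variables=['Rain', 'Sprinkler'], evidence={'Wet_Grass': 1})
print(q_3)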

Example: Sampling to create dataframe

model = bn.import_DAG('sprinkler')
df = bn.sampling(model, n=1000)
  • Output of the model:
[bnlearn] Model correct: True
CPD of Cloudy:
+-----------+-----+
| Cloudy(0) | 0.5 |
+-----------+-----+
| Cloudy(1) | 0.5 |
+-----------+-----+
CPD of Sprinkler:
+--------------+-----------+-----------+
| Cloudy       | Cloudy(0) | Cloudy(1) |
+--------------+-----------+-----------+
| Sprinkler(0) | 0.5       | 0.9       |
+--------------+-----------+-----------+
| Sprinkler(1) | 0.5       | 0.1       |
+--------------+-----------+-----------+
CPD of Rain:
+---------+-----------+-----------+
| Cloudy  | Cloudy(0) | Cloudy(1) |
+---------+-----------+-----------+
| Rain(0) | 0.8       | 0.2       |
+---------+-----------+-----------+
| Rain(1) | 0.2       | 0.8       |
+---------+-----------+-----------+
CPD of Wet_Grass:
+--------------+--------------+--------------+--------------+--------------+
| Sprinkler    | Sprinkler(0) | Sprinkler(0) | Sprinkler(1) | Sprinkler(1) |
+--------------+--------------+--------------+--------------+--------------+
| Rain         | Rain(0)      | Rain(1)      | Rain(0)      | Rain(1)      |
+--------------+--------------+--------------+--------------+--------------+
| Wet_Grass(0) | 1.0          | 0.1          | 0.1          | 0.01         |
+--------------+--------------+--------------+--------------+--------------+
| Wet_Grass(1) | 0.0          | 0.9          | 0.9          | 0.99         |
+--------------+--------------+--------------+--------------+--------------+
[bnlearn] Nodes: ['Cloudy', 'Sprinkler', 'Rain', 'Wet_Grass']
[bnlearn] Edges: [('Cloudy', 'Sprinkler'), ('Cloudy', 'Rain'), ('Sprinkler', 'Wet_Grass'), ('Rain', 'Wet_Grass')]
[bnlearn] Independencies:
(Cloudy _|_ Wet_Grass | Rain, Sprinkler)
(Sprinkler _|_ Rain | Cloudy)
(Rain _|_ Sprinkler | Cloudy)
(Wet_Grass _|_ Cloudy | Rain, Sprinkler)
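The sampled dataframe can be fed straight back into structure or parameter learning, which is exactly what the network-comparison example further below does. A short sketch with the sprinkler model:

# Round-trip: sample from the DAG and relearn its structure from the samples
model = bn.import_DAG('sprinkler')
df = bn.sampling(model, n=1000)
model_relearned = bn.structure_learning.fit(df)
bn.compare_networks(model, model_relearned)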

Example: Loading DAG from bif files

bif_file= 'sprinkler'
bif_file= 'alarm'
bif_file= 'andes'
bif_file= 'asia'
bif_file= 'pathfinder'
bif_file= 'sachs'
bif_file= 'miserables'
bif_file= 'filepath/to/model.bif'

# Loading example dataset
model = bn.import_DAG(bif_file)
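Once loaded, the DAG behaves like any other model in this package; it can, for example, be plotted or sampled from using the functions shown earlier. A sketch:

# Inspect and sample from the loaded DAG
G = bn.plot(model)
df = bn.sampling(model, n=1000)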

Example: Comparing networks

# Load asia DAG
model = bn.import_DAG('asia')
# plot ground truth
G = bn.plot(model)
# Sampling
df = bn.sampling(model, n=10000)
# Structure learning of sampled dataset
model_sl = bn.structure_learning.fit(df, methodtype='hc', scoretype='bic')
# Plot based on structure learning of sampled data
bn.plot(model_sl, pos=G['pos'])
# Compare networks and make plot
bn.compare_networks(model, model_sl, pos=G['pos'])

Graph of ground truth

Graph based on Structure learning

Graph comparison ground truth vs. structure learning

Citation

Please cite bnlearn in your publications if this is useful for your research. Here is an example BibTeX entry:

@misc{erdogant2019bnlearn,
  title={bnlearn},
  author={Erdogan Taskesen},
  year={2019},
  howpublished={\url{https://github.com/erdogant/bnlearn}},
}


Maintainer

  • Erdogan Taskesen, github: erdogant
  • Contributions are welcome.
  • If you wish to buy me a coffee for this work, it is very much appreciated :)