
mcosovic / FactorGraph.jl

License: MIT
The FactorGraph package provides a set of functions to perform inference over a factor graph with continuous or discrete random variables using the belief propagation algorithm.

Programming Languages

julia

Projects that are alternatives to or similar to FactorGraph.jl

Belief-Propagation
Overview and implementation of Belief Propagation and Loopy Belief Propagation algorithms: sum-product, max-product, max-sum
Stars: ✭ 85 (+400%)
Mutual labels:  message-passing, sum-product, belief-propagation, factor-graph, loopy-belief-propagation
fglib
factor graph library
Stars: ✭ 53 (+211.76%)
Mutual labels:  message-passing, sum-product, belief-propagation, factor-graph
codac
Codac is a library for constraint programming over reals, trajectories and sets.
Stars: ✭ 31 (+82.35%)
Mutual labels:  dynamical-systems, state-estimation
LGNpy
Linear Gaussian Bayesian Networks - Inference, Parameter Learning and Representation. 🖧
Stars: ✭ 25 (+47.06%)
Mutual labels:  message-passing, belief-propagation
graphchem
Graph-based machine learning for chemical property prediction
Stars: ✭ 21 (+23.53%)
Mutual labels:  message-passing
AVP-SLAM-PLUS
An implementation of AVP-SLAM and some new contributions
Stars: ✭ 371 (+2082.35%)
Mutual labels:  state-estimation
aether
Distributed system emulation in Common Lisp
Stars: ✭ 19 (+11.76%)
Mutual labels:  message-passing
BifurcationInference.jl
learning state-space targets in dynamical systems
Stars: ✭ 24 (+41.18%)
Mutual labels:  dynamical-systems
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+870.59%)
Mutual labels:  message-passing
sysidentpy
A Python Package For System Identification Using NARMAX Models
Stars: ✭ 139 (+717.65%)
Mutual labels:  dynamical-systems
scikit tt
Tensor Train Toolbox
Stars: ✭ 52 (+205.88%)
Mutual labels:  dynamical-systems
RigidBodySim.jl
Simulation and visualization of articulated rigid body systems in Julia
Stars: ✭ 63 (+270.59%)
Mutual labels:  dynamical-systems
eTrust
Source code and dataset for TKDE 2019 paper “Trust Relationship Prediction in Alibaba E-Commerce Platform”
Stars: ✭ 14 (-17.65%)
Mutual labels:  factor-graph
ReactiveMP.jl
Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (+241.18%)
Mutual labels:  message-passing
Active-Passive-Losses
[ICML2020] Normalized Loss Functions for Deep Learning with Noisy Labels
Stars: ✭ 92 (+441.18%)
Mutual labels:  noisy-data
chaotic-maps
Simple implementations of chaotic maps in Processing
Stars: ✭ 18 (+5.88%)
Mutual labels:  dynamical-systems
PySPOD
A Python package for spectral proper orthogonal decomposition (SPOD).
Stars: ✭ 50 (+194.12%)
Mutual labels:  dynamical-systems
pyUKF
Unscented kalman filter (UKF) library in python that supports multiple measurement updates
Stars: ✭ 52 (+205.88%)
Mutual labels:  state-estimation
godsend
A simple and eloquent workflow for streaming messages to micro-services.
Stars: ✭ 15 (-11.76%)
Mutual labels:  message-passing
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+158.82%)
Mutual labels:  message-passing

FactorGraph


FactorGraph is an open-source, easy-to-use simulation tool/solver for researchers and educators, provided as a Julia package with source code released under the MIT License. The FactorGraph package provides a set of functions to perform inference over a factor graph with continuous or discrete random variables using the belief propagation (BP) algorithm, also known as the sum-product algorithm.
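For a linear measurement model z = Hx + e with independent Gaussian noise (the setting used by the continuous examples below), the Gaussian belief propagation (GBP) messages exchanged along the edges of the factor graph take the standard closed form sketched below. The notation (m and v for message means and variances) is ours and is only meant to illustrate the sum-product idea; see the documentation for the exact conventions used by the package.

m_{f_i \to x_s} = \frac{1}{H_{is}} \Big( z_i - \sum_{x_t \in \mathcal{X}_i \setminus x_s} H_{it} \, m_{x_t \to f_i} \Big), \qquad
v_{f_i \to x_s} = \frac{1}{H_{is}^2} \Big( v_i + \sum_{x_t \in \mathcal{X}_i \setminus x_s} H_{it}^2 \, v_{x_t \to f_i} \Big)

v_{x_s \to f_i} = \Big( \sum_{f_j \in \mathcal{F}_s \setminus f_i} v_{f_j \to x_s}^{-1} \Big)^{-1}, \qquad
m_{x_s \to f_i} = v_{x_s \to f_i} \sum_{f_j \in \mathcal{F}_s \setminus f_i} \frac{m_{f_j \to x_s}}{v_{f_j \to x_s}}

where \mathcal{X}_i is the set of variables of factor f_i, \mathcal{F}_s is the set of factors incident to variable x_s, and H_{is} is the coefficient of x_s in the i-th row of H. Broadly speaking, messageFactorVariable and messageVariableFactor iterate updates of this form, and marginal fuses all incoming factor-to-variable messages into each variable's posterior mean and variance in the same precision-weighted way; the documentation describes the exact variants (broadcast, vanilla, dynamic, ageing) implemented by the package.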

We have tested and verified the simulation tool using different scenarios to the best of our ability. As a user of this simulation tool, you can help us improve future versions; we highly appreciate your feedback about any errors, inaccuracies, and bugs. For more information, please visit the documentation site.


Requirement

FactorGraph requires Julia 1.6 or higher.


Installation

To install the FactorGraph package, run the following command:

pkg> add FactorGraph

To use the FactorGraph package, add the following code to your script, or alternatively run the same command in the Julia REPL:

using FactorGraph

Quick start within the continuous framework

The following examples are intended as a quick introduction to the FactorGraph package within the continuous framework; a plain Julia cross-check of the computed marginals is sketched after the examples.

  • The broadcast GBP algorithm:
using FactorGraph

H = [1.0 0.0 0.0; 1.5 0.0 2.0; 0.0 3.1 4.6] # coefficient matrix
z = [0.5; 0.8; 4.1]                         # observation vector
v = [0.1; 1.0; 1.0]                         # variance vector

gbp = continuousModel(H, z, v)              # initialize the graphical model
for iteration = 1:50                        # the GBP inference
    messageFactorVariableBroadcast(gbp)     # compute messages using the broadcast GBP
    messageVariableFactorBroadcast(gbp)     # compute messages using the broadcast GBP
end
marginal(gbp)                               # compute marginals
  • The vanilla GBP algorithm in the dynamic framework:
using FactorGraph

H = [1.0 0.0 0.0; 1.5 0.0 2.0; 0.0 3.1 4.6] # coefficient matrix
z = [0.5; 0.8; 4.1]                         # observation vector
v = [0.1; 1.0; 1.0]                         # variance vector

gbp = continuousModel(H, z, v)              # initialize the graphical model
for iteration = 1:200                       # the GBP inference
    messageFactorVariable(gbp)              # compute messages using the vanilla GBP
    messageVariableFactor(gbp)              # compute messages using the vanilla GBP
end

dynamicFactor!(gbp;                         # integrate changes in the running GBP
    factor = 1,
    observation = 0.85,
    variance = 1e-10)
for iteration = 201:400                     # continues the GBP inference
    messageFactorVariable(gbp)              # compute messages using the vanilla GBP
    messageVariableFactor(gbp)              # compute messages using the vanilla GBP
end
marginal(gbp)                               # compute marginals
  • The vanilla GBP algorithm in the ageing framework:
using FactorGraph

H = [1.0 0.0 0.0; 1.5 0.0 2.0; 0.0 3.1 4.6] # coefficient matrix
z = [0.5; 0.8; 4.1]                         # observation vector
v = [0.1; 1.0; 1.0]                         # variance vector

gbp = continuousModel(H, z, v)              # initialize the graphical model
for iteration = 1:200                       # the GBP inference
    messageFactorVariable(gbp)              # compute messages using the vanilla GBP
    messageVariableFactor(gbp)              # compute messages using the vanilla GBP
end

for iteration = 1:400                       # continues the GBP inference
    ageingVariance!(gbp;                    # integrate changes in the running GBP
        factor = 3,
        initial = 1,
        limit = 50,
        model = 1,
        a = 0.05,
        tau = iteration)
    messageFactorVariable(gbp)              # compute messages using the vanilla GBP
    messageVariableFactor(gbp)              # compute messages using the vanilla GBP
end
marginal(gbp)                               # compute marginals
  • The forward–backward GBP algorithm over the tree factor graph:
using FactorGraph

H = [1 0 0 0 0; 6 8 2 0 0; 0 5 0 0 0;       # coefficient matrix
     0 0 2 0 0; 0 0 3 8 2]
z = [1; 2; 3; 4; 5]                         # observation vector
v = [3; 4; 2; 5; 1]                         # variance vector

gbp = continuousTreeModel(H, z, v)          # initialize the tree graphical model
while gbp.graph.forward                     # inference from leaves to the root
    forwardVariableFactor(gbp)              # compute forward messages
    forwardFactorVariable(gbp)              # compute forward messages
end
while gbp.graph.backward                    # inference from the root to leaves
    backwardVariableFactor(gbp)             # compute backward messages
    backwardFactorVariable(gbp)             # compute backward messages
end
marginal(gbp)                               # compute marginals
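For these linear Gaussian models the GBP marginal means coincide, upon convergence, with the weighted least-squares (WLS) solution, which makes the examples above easy to sanity-check. The sketch below is an independent cross-check in plain Julia using only the standard LinearAlgebra library; it is not part of the FactorGraph API, and the names W and xWLS are ours.

using LinearAlgebra

H = [1.0 0.0 0.0; 1.5 0.0 2.0; 0.0 3.1 4.6] # coefficient matrix from the first example
z = [0.5; 0.8; 4.1]                         # observation vector
v = [0.1; 1.0; 1.0]                         # variance vector

W = Diagonal(1 ./ v)                        # weights: inverse observation variances
xWLS = (H' * W * H) \ (H' * W * z)          # weighted least-squares estimate of the state

# the marginal means returned by marginal(gbp) should agree with xWLS;
# for the dynamic example, repeat the check after setting z[1] = 0.85 and v[1] = 1e-10

Note that on loopy factor graphs only the GBP means are guaranteed to match the WLS solution upon convergence; the GBP variances are exact only on tree factor graphs.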

Quick start within the discrete framework

The following example is intended as a quick introduction to the FactorGraph package within the discrete framework; a brute-force verification of the computed marginals is sketched after the example.

  • The forward–backward BP algorithm over the tree factor graph:
using FactorGraph

probability1 = [1]                          # the first factor is incident to variable node 1
table1 = [0.2; 0.3; 0.4; 0.1]               # factor values over the four states of variable 1

probability2 = [1; 2; 3]                    # the second factor is incident to variable nodes 1, 2 and 3
table2 = zeros(4, 3, 1)                     # factor values: 4, 3 and 1 states for variables 1, 2 and 3
table2[1, 1, 1] = 0.2; table2[2, 1, 1] = 0.5; table2[3, 1, 1] = 0.3; table2[4, 1, 1] = 0.0
table2[1, 2, 1] = 0.1; table2[2, 2, 1] = 0.1; table2[3, 2, 1] = 0.7; table2[4, 2, 1] = 0.1
table2[1, 3, 1] = 0.5; table2[2, 3, 1] = 0.2; table2[3, 3, 1] = 0.1; table2[4, 3, 1] = 0.1

probability = [probability1, probability2]  # variable nodes incident to each factor
table = [table1, table2]                    # corresponding factor tables

bp = discreteTreeModel(probability, table)  # initialize the tree graphical model
while bp.graph.forward                      # inference from leaves to the root
    forwardVariableFactor(bp)               # compute forward messages
    forwardFactorVariable(bp)               # compute forward messages
end
while bp.graph.backward                     # inference from the root to leaves
    backwardVariableFactor(bp)              # compute backward messages
    backwardFactorVariable(bp)              # compute backward messages
end
marginal(bp)                                # compute normalized marginals
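Because this discrete factor graph is tiny, the normalized marginals returned by marginal(bp) can be verified by brute force: multiply the two factor tables over every joint state and sum out the remaining variables. The sketch below reuses table1 and table2 from the example and plain Julia only; it does not use the FactorGraph API.

# joint distribution, up to normalization, over x1 ∈ 1:4, x2 ∈ 1:3, x3 ∈ 1:1
joint = [table1[x1] * table2[x1, x2, x3] for x1 in 1:4, x2 in 1:3, x3 in 1:1]

p1 = vec(sum(joint, dims = (2, 3)))         # sum out x2 and x3
p1 = p1 ./ sum(p1)                          # normalized marginal of variable 1; compare with marginal(bp)

Analogous sums over the other dimensions give the marginals of the remaining variables.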