
StatisticalRethinkingJulia / TuringModels.jl

Licence: other
Turing version of StatisticalRethinking models.

Programming Languages: Julia

Projects that are alternatives of or similar to TuringModels.jl

All of the projects below share the mcmc label; the percentage next to each star count is relative to the star count of TuringModels.jl.

  • Bda py demos (Stars: 781, +696.94%): Bayesian Data Analysis demos for Python
  • Emcee (Stars: 1,121, +1043.88%): The Python ensemble sampling toolkit for affine-invariant MCMC
  • Bat.jl (Stars: 82, -16.33%): A Bayesian Analysis Toolkit in Julia
  • Owl (Stars: 919, +837.76%): OCaml Scientific and Engineering Computing @ http://ocaml.xyz
  • Cmdstan.jl (Stars: 30, -69.39%): CmdStan.jl v6 provides an alternative, older Julia wrapper to Stan's `cmdstan` executable
  • Turing.jl (Stars: 1,150, +1073.47%): Bayesian inference with probabilistic programming
  • Rstan (Stars: 760, +675.51%): RStan, the R interface to Stan
  • Nimble (Stars: 95, -3.06%): The base NIMBLE package for R
  • Dblink (Stars: 38, -61.22%): Distributed Bayesian Entity Resolution in Apache Spark
  • Getdist (Stars: 80, -18.37%): MCMC sample analysis, kernel densities, plotting, and GUI
  • Pygtc (Stars: 13, -86.73%): Make a sweet giant triangle confusogram (GTC) plot
  • Psgld (Stars: 28, -71.43%): AAAI & CVPR 2016: Preconditioned Stochastic Gradient Langevin Dynamics (pSGLD)
  • Gerrychain (Stars: 68, -30.61%): Use MCMC to analyze districting plans and gerrymanders
  • Bayesian Neural Networks (Stars: 900, +818.37%): PyTorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
  • Paramonte (Stars: 88, -10.2%): ParaMonte: Plain Powerful Parallel Monte Carlo and MCMC Library for Python, MATLAB, Fortran, C++, C
  • Boltzmann Machines (Stars: 768, +683.67%): Boltzmann Machines in TensorFlow with examples
  • Bayesiantools (Stars: 66, -32.65%): General-Purpose MCMC and SMC Samplers and Tools for Bayesian Statistics
  • Ggmcmc (Stars: 95, -3.06%): Graphical tools for analyzing Markov Chain Monte Carlo simulations from Bayesian inference
  • Machine Learning Numpy (Stars: 90, -8.16%): Machine learning models in pure NumPy, covering feed-forward, RNN, CNN, clustering, MCMC, time series, tree-based models, and more
  • Posterior (Stars: 75, -23.47%): The posterior R package

TuringModels

Introduction

This package contains Julia versions of the MCMC models in the R package "rethinking", which accompanies the book Statistical Rethinking by Richard McElreath. It is part of the StatisticalRethinkingJulia GitHub organization of packages.

This package implements the models using TuringLang/Turing.jl.
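To give a flavour of what these implementations look like, here is a minimal sketch of a Turing.jl model, loosely in the spirit of the book's simple Gaussian height model; the model name, priors, and data are illustrative and are not taken from this repository.

  using Turing

  # Illustrative Gaussian model for heights (not a model from this package).
  @model function heights(h)
      μ ~ Normal(178, 20)        # prior on the mean height (cm)
      σ ~ Uniform(0, 50)         # prior on the standard deviation
      for i in eachindex(h)
          h[i] ~ Normal(μ, σ)    # likelihood for each observation
      end
  end

  # Draw posterior samples with NUTS, the sampler used throughout this package.
  chain = sample(heights([152.0, 165.0, 171.0, 183.0]), NUTS(0.65), 1_000)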

Usage

Most of the scripts and their output can be inspected via the website. If you want to run the scripts yourself, you can either

  1. copy the code from the webpages and the data from this repository, and run the scripts or
  2. clone this repository and run one of the files in scripts, for example julia --project -i scripts/basic-example.jl. A REPL-based equivalent is sketched below.
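Equivalently, after cloning you can run a script from a Julia REPL started in the repository root; this is only a sketch of the standard Pkg workflow, not a prescribed procedure:

  using Pkg
  Pkg.activate(".")      # activate this repository's project environment
  Pkg.instantiate()      # install the dependencies recorded in Project.toml
  include("scripts/basic-example.jl")   # run one of the scripts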

The scripts are written with Literate.jl so that they can be run stand-alone and as part of the website. To generate the website locally, use Franklin.jl: clone this repository, go into its root directory, and run

julia --project -ie 'using Franklin; Franklin.serve()'

This activates the project environment (thanks to the --project flag) and interactively executes Franklin.serve(). Running it interactively means that if serve fails, you are still in an active REPL session, which avoids having to restart Julia completely. Building the site for the first time takes about 20 minutes. After that, the site is available at http://localhost:8000/. Subsequent calls to serve take only a few minutes because Franklin caches the output.
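For reference, Literate.jl scripts interleave markdown (written as comments) with runnable code. The snippet below is only a sketch of that convention and of a stand-alone conversion call; it is not the repository's actual build pipeline, which Franklin drives.

  # Inside a Literate.jl script, lines starting with `# ` become markdown text
  # on the website, while the remaining lines are executable Julia code.

  # Converting such a script to markdown by hand (paths are illustrative):
  using Literate
  Literate.markdown("scripts/basic-example.jl", "output"; documenter=false)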

Versions

v2.0

  • Show the output via Franklin.jl and Literate.jl
  • Simplify code
  • Use names for models instead of numbers
  • Fix multiple models
  • Let CI fail if an error occurs during build (to avoid manually having to check >20 webpages)
  • Update README

v1.1.2

  • CI matrix simplifications
  • NUTS(0.65) checking
  • Replace ifelse by comprehension
  • CI tests for more models

v1.1.1

  • CompatHelper updates

v1.0.6

  • CompatHelper updates

v1.0.5

  • CompatHelper updates
  • yiyuezhuo/ch13_models update

v1.0.4

  • CompatHelper updates

v1.0.3

  • Karajan9 patch
  • CompatHelper updates

v1.0.2

  • Relaxed Pkg upper bounds

v1.0.1

  • Model updates by Martin Trapp

v1.0.0

  • Set upper bounds in [compat] section of Project.toml
  • Activated CompatHelper (see CompatHelper.jl)
  • No longer uses Literate.jl. This version simply contains the models.
  • Some of the models are pretty slow.

v0.5

  • Based on capturing the documentation by Literate.
  • Literate.jl used to generate notebook versions

Acknowledgements

Richard Torkar has taken the lead in developing the Turing versions of the models in chapter 8 and subsequent chapters. Martin Trapp has updated many models to recent versions of Turing.jl. Rik Huijzer is bringing the models in sync with the second edition of Statistical Rethinking, in addition to making several other improvements. Thibaut Lienart has given advice on how to use Franklin well.

The TuringLang team and #turing contributors on Slack have been extremely helpful! The Turing examples by Cameron Pfiffer and others have been a great help.

Questions and issues

Questions and contributions are very welcome, as are feature requests and suggestions. Please open an issue if you encounter any problems or have a question.
