
mj-will / nessai

Licence: MIT License
nessai: Nested Sampling with Artificial Intelligence

Programming Languages

python

Projects that are alternatives of or similar to nessai

anesthetic
Nested Sampling post-processing and plotting
Stars: ✭ 34 (+88.89%)
Mutual labels:  bayesian-inference, nested-sampling
flowtorch-old
Separating Normalizing Flows code from Pyro and improving API
Stars: ✭ 36 (+100%)
Mutual labels:  bayesian-inference, normalizing-flows
cpnest
Parallel nested sampling
Stars: ✭ 21 (+16.67%)
Mutual labels:  bayesian-inference, nested-sampling
nestle
Pure Python, MIT-licensed implementation of nested sampling algorithms for evaluating Bayesian evidence.
Stars: ✭ 66 (+266.67%)
Mutual labels:  bayesian-inference, nested-sampling
NestedSamplers.jl
Implementations of single and multi-ellipsoid nested sampling
Stars: ✭ 32 (+77.78%)
Mutual labels:  bayesian-inference, nested-sampling
ProbQA
Probabilistic question-asking system: the program asks, the users answer. The minimal goal of the program is to identify what the user needs (a target), even if the user is not aware of the existence of such a thing/product/service.
Stars: ✭ 43 (+138.89%)
Mutual labels:  bayesian-inference
brmstools
Helper functions for brmsfit objects (DEPRECATED)
Stars: ✭ 24 (+33.33%)
Mutual labels:  bayesian-inference
genstar
Generation of Synthetic Populations Library
Stars: ✭ 17 (-5.56%)
Mutual labels:  bayesian-inference
lgpr
R-package for interpretable nonparametric modeling of longitudinal data using additive Gaussian processes. Contains functionality for inferring covariate effects and assessing covariate relevances. Various models can be specified using a convenient formula syntax.
Stars: ✭ 22 (+22.22%)
Mutual labels:  bayesian-inference
score flow
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (+172.22%)
Mutual labels:  normalizing-flows
BayesHMM
Full Bayesian Inference for Hidden Markov Models
Stars: ✭ 35 (+94.44%)
Mutual labels:  bayesian-inference
UMNN
Implementation of Unconstrained Monotonic Neural Network and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Stars: ✭ 63 (+250%)
Mutual labels:  normalizing-flows
Birch
A probabilistic programming language that combines automatic differentiation, automatic marginalization, and automatic conditioning within Monte Carlo methods.
Stars: ✭ 80 (+344.44%)
Mutual labels:  bayesian-inference
FBNN
Code for "Functional variational Bayesian neural networks" (https://arxiv.org/abs/1903.05779)
Stars: ✭ 67 (+272.22%)
Mutual labels:  bayesian-inference
cosmopower
Machine Learning - accelerated Bayesian inference
Stars: ✭ 25 (+38.89%)
Mutual labels:  bayesian-inference
ZigZagBoomerang.jl
Sleek implementations of the ZigZag, Boomerang and other assorted piecewise deterministic Markov processes for Markov Chain Monte Carlo including Sticky PDMPs for variable selection
Stars: ✭ 58 (+222.22%)
Mutual labels:  bayesian-inference
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (+61.11%)
Mutual labels:  bayesian-inference
PlateFlex
Estimating effective elastic thickness of the lithosphere
Stars: ✭ 20 (+11.11%)
Mutual labels:  bayesian-inference
pyfilter
Particle filtering and sequential parameter inference in Python
Stars: ✭ 52 (+188.89%)
Mutual labels:  bayesian-inference
noisy-networks-measurements
Noisy network measurement with stan
Stars: ✭ 42 (+133.33%)
Mutual labels:  bayesian-inference

[Badges: DOI, PyPI, Documentation Status, tests, integration tests, codecov]

nessai: Nested Sampling with Artificial Intelligence

nessai (/ˈnɛsi/): Nested Sampling with Artificial Intelligence

nessai is a nested sampling algorithm for Bayesian inference that incorporates normalising flows. It is designed for applications where the Bayesian likelihood is computationally expensive.
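To give a sense of the workflow, the sketch below defines a simple model and runs the sampler. It is a minimal illustration based on the documented API (the Model and FlowSampler classes and the nlive keyword); the two-dimensional Gaussian likelihood, bounds and output path are placeholder choices and may need adjusting for your problem or nessai version.

import numpy as np

from nessai.flowsampler import FlowSampler
from nessai.model import Model


class GaussianModel(Model):
    """Illustrative two-dimensional Gaussian likelihood with uniform priors."""

    # Parameter names and their prior bounds
    names = ["x", "y"]
    bounds = {"x": [-10, 10], "y": [-10, 10]}

    def log_prior(self, x):
        # Uniform prior: -inf outside the bounds, constant inside
        log_p = np.log(self.in_bounds(x), dtype="float")
        for name in self.names:
            log_p -= np.log(self.bounds[name][1] - self.bounds[name][0])
        return log_p

    def log_likelihood(self, x):
        # Standard normal likelihood in each parameter
        log_l = np.zeros(x.size)
        for name in self.names:
            log_l += -0.5 * x[name] ** 2 - 0.5 * np.log(2 * np.pi)
        return log_l


sampler = FlowSampler(GaussianModel(), output="outdir", nlive=1000)
sampler.run()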

Installation

nessai can be installed using pip:

$ pip install nessai

Installing via conda is not currently supported.

PyTorch

By default, the installed version of PyTorch will not necessarily match the drivers on your system. To install a different version with the correct CUDA support, see the PyTorch homepage for instructions: https://pytorch.org/.
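After installing, a quick way to confirm which PyTorch build you have and whether it can see a GPU is the following short check (standard PyTorch calls, shown only as a convenience):

import torch

# Report the installed PyTorch version and whether a CUDA-capable GPU is visible
print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))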

Using bilby

As of version 1.1.0, bilby supports nessai by default, but it remains an optional requirement. See the bilby documentation for instructions on installing bilby.

See the examples included with nessai for how to run nessai via bilby.
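For reference, a minimal sketch of selecting nessai through bilby's run_sampler is shown below; the Gaussian likelihood, priors and settings are illustrative placeholders, so see the bundled examples for complete, up-to-date scripts.

import numpy as np
import bilby


class GaussianLikelihood(bilby.Likelihood):
    """Illustrative two-dimensional Gaussian likelihood."""

    def __init__(self):
        super().__init__(parameters={"x": None, "y": None})

    def log_likelihood(self):
        x = self.parameters["x"]
        y = self.parameters["y"]
        return -0.5 * (x**2 + y**2) - np.log(2 * np.pi)


# Uniform priors on both parameters
priors = {
    "x": bilby.core.prior.Uniform(-10, 10, "x"),
    "y": bilby.core.prior.Uniform(-10, 10, "y"),
}

result = bilby.run_sampler(
    likelihood=GaussianLikelihood(),
    priors=priors,
    sampler="nessai",   # select nessai as the nested sampler
    nlive=1000,         # illustrative number of live points
    outdir="outdir",
    label="nessai_example",
)
result.plot_corner()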

Documentation

Documentation is available at: nessai.readthedocs.io

Contributing

Please see the guidelines here.

Acknowledgements

The core nested sampling code, model design, and code for computing the posterior in nessai were based on cpnest, with permission from the authors.

The normalising flows implemented in nessai are all either directly imported from nflows or heavily based on it.

Other code snippets that are based on existing code reference the original source in their corresponding docstrings.

The authors also thank Laurence Datrier, Fergus Hayes and Jethro Linley for their feedback and help finding bugs in nessai.

Citing

If you find nessai useful in your work, please cite the DOI for this code and our paper:

@software{nessai,
  author       = {Michael J. Williams},
  title        = {nessai: Nested Sampling with Artificial Intelligence},
  month        = feb,
  year         = 2021,
  publisher    = {Zenodo},
  version      = {latest},
  doi          = {10.5281/zenodo.4550693},
  url          = {https://doi.org/10.5281/zenodo.4550693}
}

@article{PhysRevD.103.103006,
  title = {Nested sampling with normalizing flows for gravitational-wave inference},
  author = {Williams, Michael J. and Veitch, John and Messenger, Chris},
  journal = {Phys. Rev. D},
  volume = {103},
  issue = {10},
  pages = {103006},
  numpages = {19},
  year = {2021},
  month = {May},
  publisher = {American Physical Society},
  doi = {10.1103/PhysRevD.103.103006},
  url = {https://link.aps.org/doi/10.1103/PhysRevD.103.103006}
}