phylovi / bito

License: GPL-3.0
Python-interface C++ library for Bayesian phylogenetics via optimization

bito

bito, or "Bayesian Inference of Trees via Optimization", is a Python-interface C++ library for phylogenetic variational inference. It lets you express the interesting parts of your phylogenetic model in Python/TensorFlow/PyTorch/etc. while bito handles the tree structure and likelihood computations for you. "Bito" is also the name of a tree native to Africa that produces medicinal oil. We pronounce "bito" with a long /e/ sound ("bito" rhymes with "burrito").
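
As a concrete (and heavily simplified) illustration of that division of labor, the sketch below wraps a stand-in likelihood engine as a differentiable PyTorch operation: the engine returns a log-likelihood and its gradient with respect to branch lengths, and PyTorch handles the rest of the model and the optimization. All class and method names here are hypothetical placeholders, not bito's actual API.

    import torch


    class ToyLikelihoodEngine:
        """Stand-in for a C++ engine (placeholder math, not real phylogenetics)."""

        def log_likelihood_and_gradient(self, branch_lengths):
            # Pretend log-likelihood that peaks when every branch length is 0.1.
            diff = branch_lengths - 0.1
            return -float((diff ** 2).sum()), -2.0 * diff


    class TreeLogLikelihood(torch.autograd.Function):
        """Expose the engine's value and gradient to PyTorch autograd."""

        @staticmethod
        def forward(ctx, branch_lengths, engine):
            ll, grad = engine.log_likelihood_and_gradient(branch_lengths.detach().numpy())
            ctx.save_for_backward(torch.as_tensor(grad, dtype=branch_lengths.dtype))
            return branch_lengths.new_tensor(ll)

        @staticmethod
        def backward(ctx, grad_output):
            (grad,) = ctx.saved_tensors
            # Gradient flows back to branch_lengths; the engine gets no gradient.
            return grad_output * grad, None


    engine = ToyLikelihoodEngine()
    branch_lengths = torch.full((5,), 0.5, requires_grad=True)
    optimizer = torch.optim.Adam([branch_lengths], lr=0.05)
    for _ in range(200):
        optimizer.zero_grad()
        loss = -TreeLogLikelihood.apply(branch_lengths, engine)
        loss.backward()
        optimizer.step()
    print(branch_lengths)  # should approach 0.1 for every branch

With a real engine, the forward/backward pair stays the same; only the stand-in arithmetic is replaced by tree-structure and likelihood computations done in C++.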

This library is in an experimental state. It was formerly known as "libsbn".

Dependencies

  • If you are on Linux, install gcc >= 7.5, which is standard in Debian Buster and Ubuntu 18.04.
  • If you are on OS X, use a recent version of Xcode and install the command line tools.

We suggest using Anaconda and the associated conda environment file, which installs the relevant dependencies:

conda env create -f environment.yml
conda activate bito

(Very optional) The notebooks require R, IRkernel, rpy2 >= 3.1.0, and some R packages such as ggplot2 and cowplot.
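
If you want to confirm that rpy2 can see your R installation and those packages before opening the notebooks, a quick check along these lines should work (package names as above; nothing here is bito-specific):

    # Sanity check for the optional notebook dependencies (R + rpy2 + R plotting packages).
    import rpy2
    from rpy2.robjects.packages import importr

    print("rpy2 version:", rpy2.__version__)
    importr("ggplot2")  # raises an error if the R package is not installed
    importr("cowplot")
    print("R plotting packages are available")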

Building

For your first build, do

  • git submodule update --init --recursive
  • make

This will install the bito Python module.
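
After the build finishes, a one-line import is enough to confirm that the module landed in your active conda environment (a minimal check, assuming you built inside the bito environment from above):

    # Verify that the freshly built Python module is importable.
    import bito

    print("bito imported from:", bito.__file__)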

You can build and run tests using make test and make fasttest (the latter excludes some slow tests).

Note that make accepts -j flags for multi-core builds: e.g. -j20 will build with 20 jobs.

  • (Optional) If you modify the lexer and parser, call make bison. This assumes that you have installed Bison >= 3.4 (conda install -c conda-forge bison).
  • (Optional) If you modify the test preparation scripts, call make prep. This assumes that you have installed ete3 (conda install -c etetoolkit ete3).
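
If you do install ete3 for the test-preparation scripts, a small smoke test (unrelated to bito itself) confirms it is working:

    # Quick ete3 smoke test: parse a Newick string and draw it as ASCII art.
    from ete3 import Tree

    tree = Tree("((A:1,B:1):1,C:2);")
    print(tree.get_ascii(show_internal=False))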

Understanding

The following two papers explain what this repository is about:

  • Cheng Zhang and Frederick A. Matsen IV. Generalizing tree probability estimation via Bayesian networks. NeurIPS 2018.
  • Cheng Zhang and Frederick A. Matsen IV. Variational Bayesian phylogenetic inference. ICLR 2019.

Our documentation consists of:

Contributing

We welcome your contributions! Please see our detailed contribution guidelines.

Contributors

  • Erick Matsen (@matsen): implementation, design, janitorial duties
  • Dave H. Rich (@DaveRich): core developer
  • Ognian Milanov (@ognian-): core developer
  • Mathieu Fourment (@4ment): implementation of substitution models and likelihoods/gradients, design
  • Seong-Hwan Jun (@junseonghwan): generalized pruning design and implementation, implementation of SBN gradients, design
  • Hassan Nasif (@hrnasif): hot start for generalized pruning; gradient descent for generalized pruning
  • Anna Kooperberg (@annakooperberg): refactoring the subsplit DAG
  • Sho Kiami (@shokiami): refactoring the subsplit DAG
  • Tanvi Ganapathy (@tanviganapathy): refactoring the subsplit DAG
  • Lucy Yang (@lucyyang01): subsplit DAG visualization
  • Cheng Zhang (@zcrabbit): concept, design, algorithms
  • Christiaan Swanepoel (@christiaanjs): design
  • Xiang Ji (@xji3): gradient expertise and node height code
  • Marc Suchard (@msuchard): gradient expertise and node height code
  • Michael Karcher (@mdkarcher): SBN expertise
  • Eric J. Isaac (@EricJIsaac): C++ wisdom

Citations

If you are citing this library, please cite the NeurIPS and ICLR papers listed above. We require BEAGLE, so please also cite these papers:

  • Ayres et al. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics. Systematic Biology 61(1):170-173, 2012.
  • Ayres et al. BEAGLE 3: improved performance, scaling, and usability for a high-performance computing library for statistical phylogenetics. Systematic Biology 68(6):1052-1061, 2019.

Acknowledgements

  • Jaime Huerta-Cepas: several tree traversal functions are copied from ete3
  • Thomas Junier: parts of the parser are copied from newick_utils
  • The parser driver is derived from the Bison C++ example

In addition to the packages mentioned above, we also employ:
