
jpn-- / larch

License: GPL-3.0
Larch: a Python tool for choice modeling

Languages: Python, Jupyter Notebook, Cython, C, C++, Batchfile


![conda-forge version](https://img.shields.io/conda/v/conda-forge/larch) ![conda-forge downloads](https://img.shields.io/conda/dn/conda-forge/larch) ![license](https://img.shields.io/conda/l/conda-forge/larch)
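The badges above indicate the package is distributed on conda-forge, so a typical installation (assuming a working conda setup) would look like:

```shell
# Install larch from the conda-forge channel (assumes conda is available)
conda install -c conda-forge larch
```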

Larch: the logit architect

This is a tool for the estimation and application of logit-based discrete choice models. It is designed to integrate with NumPy and facilitate fast processing of linear models. If you want to estimate non-linear models, try Biogeme, which is more flexible in form and can be used for almost any model structure. If you don't know what the difference is, you probably want to start with linear models.

Larch is undergoing a transformation, with a new computational architecture that can significantly improve performance when working with large datasets. The new code relies on [numba](https://numba.pydata.org/), [xarray](https://xarray.pydata.org/en/stable/), and [sharrow](https://activitysim.github.io/sharrow) to enable super-fast estimation of choice models. Many (but not yet all) of the core features of Larch have been moved over to this new platform.

You can still use the old version of Larch as normal, but to try out the new version, import `larch.numba` instead of `larch` itself.
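A minimal sketch of that switch, using the module names described above (the try/except fallback logic is illustrative, not part of Larch itself):

```python
# Prefer the new numba-based platform when it is available, and fall
# back to the classic implementation otherwise. Assumes larch is
# installed; if it is not, lx is left as None.
try:
    import larch.numba as lx  # new, faster computational architecture
except ImportError:
    try:
        import larch as lx    # classic implementation
    except ImportError:
        lx = None             # larch not installed in this environment
```

Code written against the shared API can then refer to `lx` uniformly, regardless of which platform was imported.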

This project is very much under development. There are plenty of undocumented functions and features; use them at your own risk. Undocumented features may be non-functional, not rigorously tested, deprecated, or removed without notice in a future version.
