CitrineInformatics / lolo

License: Apache-2.0
A random forest-centered machine learning library in Scala

Projects that are alternatives to or similar to lolo

cheapml
Machine Learning algorithms coded from scratch
Stars: ✭ 17 (-54.05%)
Mutual labels:  random-forest, machine-learning-algorithms, regression
onelearn
Online machine learning methods
Stars: ✭ 14 (-62.16%)
Mutual labels:  random-forest, regression
bitcoin-prediction
Bitcoin prediction algorithms
Stars: ✭ 21 (-43.24%)
Mutual labels:  random-forest, machine-learning-algorithms
25daysinmachinelearning
A repository for learning machine learning with Python, with statistics content and materials
Stars: ✭ 53 (+43.24%)
Mutual labels:  random-forest, machine-learning-algorithms
Dynaml
Scala Library/REPL for Machine Learning Research
Stars: ✭ 195 (+427.03%)
Mutual labels:  machine-learning-algorithms, regression
R-stats-machine-learning
Misc Statistics and Machine Learning codes in R
Stars: ✭ 33 (-10.81%)
Mutual labels:  random-forest, regression
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)
Stars: ✭ 516 (+1294.59%)
Mutual labels:  random-forest, machine-learning-algorithms
Machine Learning Algorithms
A curated list of almost all machine learning algorithms and deep learning algorithms grouped by category.
Stars: ✭ 92 (+148.65%)
Mutual labels:  machine-learning-algorithms, regression
Machine Learning With Python
Practice and tutorial-style notebooks covering a wide variety of machine learning techniques
Stars: ✭ 2,197 (+5837.84%)
Mutual labels:  random-forest, regression
Machine Learning Models
Decision Trees, Random Forest, Dynamic Time Warping, Naive Bayes, KNN, Linear Regression, Logistic Regression, Mixture Of Gaussian, Neural Network, PCA, SVD, Gaussian Naive Bayes, Fitting Data to Gaussian, K-Means
Stars: ✭ 160 (+332.43%)
Mutual labels:  random-forest, machine-learning-algorithms
Orange3
🍊 📊 💡 Orange: Interactive data analysis
Stars: ✭ 3,152 (+8418.92%)
Mutual labels:  random-forest, regression
Machine learning
Study and implementation of the main Machine Learning algorithms in Jupyter Notebooks.
Stars: ✭ 161 (+335.14%)
Mutual labels:  machine-learning-algorithms, regression
Mlkit
A simple machine learning framework written in Swift 🤖
Stars: ✭ 144 (+289.19%)
Mutual labels:  machine-learning-algorithms, regression
regression-python
In this repository you can find many different small projects that demonstrate regression techniques using the Python programming language
Stars: ✭ 15 (-59.46%)
Mutual labels:  machine-learning-algorithms, regression
Machine Learning Concepts
Machine Learning Concepts with Concepts
Stars: ✭ 134 (+262.16%)
Mutual labels:  machine-learning-algorithms, regression
MachineLearningSeries
Videos and code from Universo Discreto teaching the fundamentals of Machine Learning in Python. For more details, follow the listed playlist.
Stars: ✭ 20 (-45.95%)
Mutual labels:  random-forest, machine-learning-algorithms
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+21251.35%)
Mutual labels:  machine-learning-algorithms, regression
Openml R
R package to interface with OpenML
Stars: ✭ 81 (+118.92%)
Mutual labels:  machine-learning-algorithms, regression
The Data Science Workshop
A New, Interactive Approach to Learning Data Science
Stars: ✭ 126 (+240.54%)
Mutual labels:  random-forest, regression
Machine-Learning-Algorithms
All Machine Learning Algorithms
Stars: ✭ 24 (-35.14%)
Mutual labels:  machine-learning-algorithms, regression

Lolo

Lolo is a random forest-centered machine learning library in Scala.

The core of Lolo is bagging simple base learners, like decision trees, to produce models that can generate robust uncertainty estimates.
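
The following is a minimal, self-contained sketch of that idea (illustrative only, not Lolo's implementation): train several copies of a base learner on bootstrap resamples, report the ensemble mean as the prediction, and use the ensemble spread as a naive uncertainty estimate.

import scala.util.Random

// Bagging sketch: `trainBase` fits one base learner (e.g. a decision tree)
// and returns its prediction function.
def bag(
    data: Seq[(Vector[Double], Double)],
    trainBase: Seq[(Vector[Double], Double)] => (Vector[Double] => Double),
    numBags: Int = 64,
    rng: Random = new Random(0L)
): Vector[Double] => (Double, Double) = {
  val models = Seq.fill(numBags) {
    // Draw n rows with replacement from the n training rows.
    val resample = Seq.fill(data.size)(data(rng.nextInt(data.size)))
    trainBase(resample)
  }
  x => {
    val preds = models.map(m => m(x))
    val mean = preds.sum / preds.size
    val variance = preds.map(p => (p - mean) * (p - mean)).sum / (preds.size - 1)
    (mean, math.sqrt(variance)) // point prediction and a naive uncertainty
  }
}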

Lolo supports:

  • continuous and categorical features
  • regression, classification, and multi-task trees
  • bagged learners to produce ensemble models, e.g. random forests
  • linear and ridge regression
  • regression leaf models, e.g. ridge regression trained on the leaf data
  • random rotation ensembles
  • recalibrated bootstrap prediction interval estimates
  • bias-corrected jackknife-after-bootstrap and infinitesimal jackknife confidence interval estimates (see the formulas after this list)
  • bias models trained on out-of-bag residuals
  • feature importances computed via variance reduction or Shapley values (which are additive and per-prediction)
  • model-based feature importance
  • distance correlation
  • hyperparameter optimization via grid or random search
  • parallel training via Scala parallel collections
  • validation metrics for accuracy and uncertainty quantification
  • visualization of predicted-vs-actual validations
  • deterministic training via random seeds
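
For the jackknife-based confidence intervals referenced above, the estimators follow Wager, Hastie, and Efron (2014) (see Related projects). With n training points, B trees, t_b(x) the prediction of tree b at x, the bar denoting the ensemble mean, and the (-i) subscript denoting the mean over only those trees whose bootstrap sample omits point i, the jackknife-after-bootstrap variance and its Monte Carlo bias correction are (as published; the exact form Lolo implements may differ slightly):

\hat{V}_J^B(x) = \frac{n-1}{n} \sum_{i=1}^{n} \left( \bar{t}_{(-i)}(x) - \bar{t}(x) \right)^2

\hat{V}_{J\text{-}U}^B(x) = \hat{V}_J^B(x) - (e - 1)\,\frac{n}{B^2} \sum_{b=1}^{B} \left( t_b(x) - \bar{t}(x) \right)^2

where e is Euler's number.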

Usage

Lolo is available on Maven Central and can be used by adding the following dependency block to your pom.xml:

<dependency>
    <groupId>io.citrine</groupId>
    <artifactId>lolo</artifactId>
    <version>6.0.0</version>
</dependency>
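
If you build with sbt rather than Maven, the corresponding dependency line would be the following (note that the artifactId above has no Scala-version suffix, so the single-% form is used instead of %%):

libraryDependencies += "io.citrine" % "lolo" % "6.0.0"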

Lolo provides higher-level wrappers for common learner combinations. For example, you can train a random forest regressor with:

import io.citrine.lolo.learners.RandomForestRegressor
// Pack (feature vector, label) pairs into Lolo's training-row format.
val trainingData: Seq[TrainingRow[Double]] = TrainingRow.build(features.zip(labels))
// Train a random forest and extract the fitted model.
val model = RandomForestRegressor().train(trainingData).model
// Apply the model; `expected` holds the point predictions.
val predictions: Seq[Double] = model.transform(testInputs).expected
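
Since uncertainty quantification is Lolo's main draw, you will typically keep the whole prediction result rather than only the expected values. A minimal sketch, assuming the result of transform exposes an uncertainty accessor alongside expected (the exact name and return type vary across versions; older releases used getUncertainty() — check the API docs for the release you depend on):

val result = model.transform(testInputs)
val expected: Seq[Double] = result.expected
// Assumed accessor: per-prediction uncertainty estimates, available when the
// learner supports them. Verify the signature for your lolo version.
val uncertainty: Option[Seq[Any]] = result.uncertainty()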

Performance

Lolo prioritizes functionality over performance, but it is still quite fast. For its random forest use case, the time complexity of each operation scales as follows in the number of training rows, features, and trees (n denotes the quantity in each column):

Time complexity   Training rows   Features   Trees
train             O(n log n)      O(n)       O(n)
loss              O(n log n)      O(n)       O(n)
expected          O(log n)        O(1)       O(n)
uncertainty       O(n)            O(1)       O(n)

On an Ivy Bridge test platform, the (1024 row, 1024 tree, 8 feature) performance test took 1.4 sec to train and 2.3 ms per prediction with uncertainty.
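
As a worked example of that scaling: training is O(n log n) in rows, so doubling the benchmark to 2,048 rows should multiply training time by roughly 2 × log(2048)/log(1024) = 2 × 11/10 = 2.2, taking the 1.4 s figure to about 3 s, while the O(log n) per-prediction cost would grow by only about 10%.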

Contributing

We welcome bug reports, feature requests, and pull requests. Pull requests should be made following the feature branch workflow: branching off of and opening PRs into main.

Production releases are triggered by tags: the sbt-ci-release plugin uses the tag as the lolo version. lolopy versions, by contrast, are still read from setup.py, so a version bump there is required for each lolopy release; failing to bump the lolopy version number results in a skipped lolopy release rather than a build failure.
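
Concretely, cutting a lolo release amounts to pushing a tag (the version number here is hypothetical):

git tag -a v6.0.1 -m "Release 6.0.1"
git push origin v6.0.1

sbt-ci-release then publishes lolo at version 6.0.1 from that tag; if lolopy should be released as well, bump the version in its setup.py before tagging.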

Code Formatting

  • Consistent formatting is enforced by scalafmt.
  • The easiest way to check whether scalafmt is satisfied is to run sbt scalafmtCheckAll from the command line; it reports any files that need to be reformatted. Pull requests are gated on this check passing.
  • You can also check formatting automatically before pushing to an upstream repository using a git hook. Install the pre-commit framework by following the instructions here, then enable the hooks in .pre-commit-config.yaml by running pre-commit install --hook-type pre-push from the root directory. This runs scalafmtCheckAll before every push to a remote repo.
  • To format code, run sbt scalafmtAll from the command line or configure your IDE to format files on save.

Authors

See Contributors

Related projects

  • randomForestCI is an R-based implementation of jackknife variance estimates by S. Wager