
Projects that are alternatives to or similar to NGBoost

decision-trees-for-ml
Building Decision Trees From Scratch In Python
Stars: ✭ 61 (-93.58%)
Mutual labels:  gradient-boosting
arboreto
A scalable python-based framework for gene regulatory network inference using tree-based ensemble regressors.
Stars: ✭ 33 (-96.53%)
Mutual labels:  gradient-boosting
Interpret
Fit interpretable models. Explain blackbox machine learning.
Stars: ✭ 4,352 (+358.11%)
Mutual labels:  gradient-boosting
cheapml
Machine Learning algorithms coded from scratch
Stars: ✭ 17 (-98.21%)
Mutual labels:  gradient-boosting
Apartment-Interest-Prediction
Predict people interest in renting specific NYC apartments. The challenge combines structured data, geolocalization, time data, free text and images.
Stars: ✭ 17 (-98.21%)
Mutual labels:  gradient-boosting
influence boosting
Supporting code for the paper "Finding Influential Training Samples for Gradient Boosted Decision Trees"
Stars: ✭ 57 (-94%)
Mutual labels:  gradient-boosting
Infiniteboost
InfiniteBoost: building infinite ensembles with gradient descent
Stars: ✭ 180 (-81.05%)
Mutual labels:  gradient-boosting
Awesome Gradient Boosting Papers
A curated list of gradient boosting research papers with implementations.
Stars: ✭ 704 (-25.89%)
Mutual labels:  gradient-boosting
autogbt-alt
An experimental Python package that reimplements AutoGBT using LightGBM and Optuna.
Stars: ✭ 76 (-92%)
Mutual labels:  gradient-boosting
Drishti
Real time eye tracking for embedded and mobile devices.
Stars: ✭ 325 (-65.79%)
Mutual labels:  gradient-boosting
orderbook modeling
Example of order book modeling.
Stars: ✭ 38 (-96%)
Mutual labels:  gradient-boosting
handson-ml
Jupyter notebooks containing the examples and exercises from the book "Hands-On Machine Learning."
Stars: ✭ 285 (-70%)
Mutual labels:  gradient-boosting
lleaves
Compiler for LightGBM gradient-boosted trees, based on LLVM. Speeds up prediction by ≥10x.
Stars: ✭ 132 (-86.11%)
Mutual labels:  gradient-boosting
stackgbm
🌳 Stacked Gradient Boosting Machines
Stars: ✭ 24 (-97.47%)
Mutual labels:  gradient-boosting
Gbdt simple tutorial
A Python implementation of GBDT for regression, binary classification, and multi-class classification, with the full algorithm workflow explained and visualized step by step to help readers understand Gradient Boosting Decision Trees.
Stars: ✭ 503 (-47.05%)
Mutual labels:  gradient-boosting
Lightautoml
LAMA - automatic model creation framework
Stars: ✭ 196 (-79.37%)
Mutual labels:  gradient-boosting
yggdrasil-decision-forests
A collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models.
Stars: ✭ 156 (-83.58%)
Mutual labels:  gradient-boosting
Awesome Fraud Detection Papers
A curated list of data mining papers about fraud detection.
Stars: ✭ 843 (-11.26%)
Mutual labels:  gradient-boosting
Hyperband
Tuning hyperparams fast with Hyperband
Stars: ✭ 555 (-41.58%)
Mutual labels:  gradient-boosting
Ensemble-Pytorch
A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
Stars: ✭ 407 (-57.16%)
Mutual labels:  gradient-boosting

NGBoost: Natural Gradient Boosting for Probabilistic Prediction


ngboost is a Python library that implements Natural Gradient Boosting, as described in "NGBoost: Natural Gradient Boosting for Probabilistic Prediction". It is built on top of Scikit-Learn, and is designed to be scalable and modular with respect to choice of proper scoring rule, distribution, and base learner. A didactic introduction to the methodology underlying NGBoost is available in this slide deck.
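To give some intuition for the "natural gradient" in the name: rather than following the ordinary gradient of the scoring rule, NGBoost preconditions it with the inverse Fisher information, which makes the update invariant to how the distribution is parameterized. The following is a minimal hand-rolled sketch of that idea for a Normal distribution under the log score, using its closed-form Fisher information; it is illustrative only and not NGBoost's internal code.

```python
import math

# Natural gradient for a Normal(mu, sigma) parameterized as (mu, log sigma),
# under the log score (negative log likelihood). Illustrative sketch only.

def nll_gradient(y, mu, log_sigma):
    """Ordinary gradient of the NLL of Normal(mu, exp(log_sigma)) at y."""
    sigma2 = math.exp(2 * log_sigma)
    d_mu = (mu - y) / sigma2
    d_log_sigma = 1.0 - (y - mu) ** 2 / sigma2
    return d_mu, d_log_sigma

def natural_gradient(y, mu, log_sigma):
    """Fisher-preconditioned gradient. For this parameterization the Fisher
    information is diagonal, diag(1/sigma^2, 2), so inverting it is trivial."""
    sigma2 = math.exp(2 * log_sigma)
    d_mu, d_log_sigma = nll_gradient(y, mu, log_sigma)
    return d_mu * sigma2, d_log_sigma / 2.0

# Example: observation y = 0 under the current prediction Normal(1, 2)
g = nll_gradient(0.0, 1.0, math.log(2.0))
ng = natural_gradient(0.0, 1.0, math.log(2.0))
print(g)   # ordinary gradient:  (0.25, 0.75)
print(ng)  # natural gradient:   (1.0, 0.375)
```

NGBoost fits one base learner per distribution parameter at each boosting stage and uses updates of this preconditioned form, which is what lets it plug in different distributions and scoring rules without reworking the boosting machinery.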

Installation

via pip

pip install --upgrade ngboost

via conda-forge

conda install -c conda-forge ngboost

Usage

Probabilistic regression example on the California housing dataset (the Boston housing dataset used in earlier versions of this example has been removed from recent scikit-learn releases):

from ngboost import NGBRegressor

from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X, Y = fetch_california_housing(return_X_y=True)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state=42)

ngb = NGBRegressor().fit(X_train, Y_train)

# point predictions (the mean of each predicted distribution)
Y_preds = ngb.predict(X_test)

# full predictive distributions, one per test point
Y_dists = ngb.pred_dist(X_test)

# test Mean Squared Error
test_MSE = mean_squared_error(Y_test, Y_preds)
print('Test MSE', test_MSE)

# test Negative Log Likelihood
test_NLL = -Y_dists.logpdf(Y_test).mean()
print('Test NLL', test_NLL)
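To make the NLL metric concrete: for the default Normal distribution, the negative log likelihood of a single prediction can be computed by hand. The sketch below does exactly what negating `logpdf` does for one test point; the `loc`/`scale` values are made up for illustration.

```python
import math

def normal_nll(y, loc, scale):
    """Negative log likelihood of observation y under Normal(loc, scale).
    This is the per-point quantity that -Y_dists.logpdf(Y_test) yields."""
    return (0.5 * math.log(2 * math.pi)
            + math.log(scale)
            + (y - loc) ** 2 / (2 * scale ** 2))

# A well-calibrated, centered prediction scores well (lower is better):
print(normal_nll(0.0, 0.0, 1.0))  # 0.5 * log(2*pi), about 0.919

# An overconfident (small scale) but wrong prediction is penalized heavily:
print(normal_nll(0.0, 3.0, 0.1))
```

Unlike MSE, the NLL rewards honest uncertainty estimates: a model that is wrong but admits a wide `scale` scores far better than one that is wrong and confident.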

Details on available distributions, scoring rules, learners, tuning, and model interpretation are available in our user guide, which also includes numerous usage examples and information on how to add new distributions or scores to NGBoost.

License

Apache License 2.0.

Reference

Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Y. Ng, Alejandro Schuler. 2019. NGBoost: Natural Gradient Boosting for Probabilistic Prediction. arXiv preprint.
