
StatMixedML / XGBoostLSS

License: Apache-2.0
An extension of XGBoost to probabilistic forecasting

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to XGBoostLSS

piven
Official implementation of the paper "PIVEN: A Deep Neural Network for Prediction Intervals with Specific Value Prediction" by Eli Simhayev, Gilad Katz and Lior Rokach
Stars: ✭ 26 (-85.71%)
Mutual labels:  uncertainty-estimation, prediction-intervals
target-and-market
A data-driven tool to identify the best candidates for a marketing campaign and optimize it.
Stars: ✭ 19 (-89.56%)
Mutual labels:  xgboost
kserve
Serverless Inferencing on Kubernetes
Stars: ✭ 1,621 (+790.66%)
Mutual labels:  xgboost
Kaggle-Competition-Sberbank
Code for a top-1% finish (22/3270) in the Kaggle competition Sberbank Russian Housing Market: https://www.kaggle.com/c/sberbank-russian-housing-market
Stars: ✭ 31 (-82.97%)
Mutual labels:  xgboost
Apartment-Interest-Prediction
Predict people's interest in renting specific NYC apartments. The challenge combines structured data, geolocation, time data, free text, and images.
Stars: ✭ 17 (-90.66%)
Mutual labels:  xgboost
xgboost-lightgbm-hyperparameter-tuning
Bayesian Optimization and Grid Search for xgboost/lightgbm
Stars: ✭ 40 (-78.02%)
Mutual labels:  xgboost
mloperator
Machine Learning Operator & Controller for Kubernetes
Stars: ✭ 85 (-53.3%)
Mutual labels:  xgboost
Calibrated-Boosting-Forest
Original implementation of Calibrated Boosting-Forest
Stars: ✭ 18 (-90.11%)
Mutual labels:  xgboost
kaggle-code
A repository for some of the code I used in Kaggle data science & machine learning tasks.
Stars: ✭ 100 (-45.05%)
Mutual labels:  xgboost
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (-84.07%)
Mutual labels:  uncertainty-estimation
kaggle getting started
Kaggle getting started competition examples
Stars: ✭ 18 (-90.11%)
Mutual labels:  xgboost
aws-machine-learning-university-dte
Machine Learning University: Decision Trees and Ensemble Methods
Stars: ✭ 119 (-34.62%)
Mutual labels:  xgboost
featurewiz
Use advanced feature engineering strategies and select the best features from your data set with a single line of code.
Stars: ✭ 229 (+25.82%)
Mutual labels:  xgboost
ai-deployment
Focused on deploying and serving AI models in production.
Stars: ✭ 149 (-18.13%)
Mutual labels:  xgboost
secure-xgboost
Secure collaborative training and inference for XGBoost.
Stars: ✭ 80 (-56.04%)
Mutual labels:  xgboost
Automatic-Stock-Trading
A trading algorithm based on XGBoost.
Stars: ✭ 58 (-68.13%)
Mutual labels:  xgboost
Tencent2017 Final Rank28 code
Rank 28 code for the first Tencent Social Advertising College Algorithm Competition (2017).
Stars: ✭ 85 (-53.3%)
Mutual labels:  xgboost
HumanOrRobot
A solution for the Kaggle competition `Human or Robot`.
Stars: ✭ 16 (-91.21%)
Mutual labels:  xgboost
Arch-Data-Science
Archlinux PKGBUILDs for Data Science, Machine Learning, Deep Learning, NLP and Computer Vision
Stars: ✭ 92 (-49.45%)
Mutual labels:  xgboost
tensorflow kaggle house price
[Done] Master branch: stacked regression (score 0.11, top 5%) built with xgboost and sklearn. Branch v1.0: linear regression (score 0.45) built with TensorFlow.
Stars: ✭ 25 (-86.26%)
Mutual labels:  xgboost

XGBoostLSS - An extension of XGBoost to probabilistic forecasting

We propose a new framework of XGBoost that predicts the entire conditional distribution of a univariate response variable. In particular, XGBoostLSS models all moments of a parametric distribution, i.e., mean, location, scale and shape (LSS), instead of the conditional mean only. Choosing from a wide range of continuous, discrete, and mixed discrete-continuous distributions, modelling and predicting the entire conditional distribution greatly enhances the flexibility of XGBoost, as it allows the creation of probabilistic forecasts from which prediction intervals and quantiles of interest can be derived.
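
As a minimal sketch of this idea (independent of the XGBoostLSS API; the arrays mu and sigma below are placeholders for model output), any quantile of a predicted Gaussian follows from its inverse CDF:

import numpy as np
from scipy.stats import norm

# Placeholder predictions of the conditional location and scale per observation
mu = np.array([10.0, 12.5, 9.8])
sigma = np.array([1.0, 2.0, 0.5])

# 90% prediction interval from the 5% and 95% quantiles of each Gaussian
lower = norm.ppf(0.05, loc=mu, scale=sigma)
upper = norm.ppf(0.95, loc=mu, scale=sigma)
print(np.column_stack([lower, upper]))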

News

💥 [2022-01-03] XGBoostLSS now supports estimation of the Gamma distribution.
💥 [2021-12-22] XGBoostLSS now supports estimating the full predictive distribution via Expectile Regression.
💥 [2021-12-20] XGBoostLSS is now initialized with suitable starting values to improve convergence of estimation.
💥 [2021-12-04] XGBoostLSS now supports automatic derivation of Gradients and Hessians.
💥 [2021-12-02] XGBoostLSS now supports pruning during hyperparameter optimization.
💥 [2021-11-14] XGBoostLSS v0.1.0 is released!

Features

Simultaneous updating of all distributional parameters.
Automatic derivation of Gradients and Hessians of all distributional parameters using PyTorch (see the sketch after this list).
Automated hyper-parameter search, including pruning, is done via Optuna.
The output of XGBoostLSS is explained using SHapley Additive exPlanations.
XGBoostLSS is available in Python.
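
As a minimal sketch (not the XGBoostLSS source) of how PyTorch's autograd can supply both derivatives, consider the gradient and diagonal Hessian of a Gaussian negative log-likelihood with respect to a raw (unconstrained) log-scale parameter:

import torch

y = torch.tensor([0.8, 1.2, 1.9])               # observed responses
mu = torch.tensor([1.0, 1.0, 2.0])              # current location estimates
log_sigma = torch.zeros(3, requires_grad=True)  # raw (log) scale parameter

# Negative log-likelihood of the Gaussian, parameterized by log(sigma)
nll = -torch.distributions.Normal(mu, log_sigma.exp()).log_prob(y).sum()

# First derivative via autograd; create_graph allows differentiating again
(grad,) = torch.autograd.grad(nll, log_sigma, create_graph=True)

# Second derivative: each grad[i] depends only on log_sigma[i], so
# differentiating grad.sum() yields the diagonal of the Hessian
hess = torch.autograd.grad(grad.sum(), log_sigma)[0]
print(grad, hess)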

Work in Progress

🚧 Functions that facilitate the choice and evaluation of a candidate distribution (e.g., quantile residual plots, ...).
🚧 Calling XGBoostLSS from R via the reticulate package.
🚧 Estimation of full predictive distribution without relying on a distributional assumption.

Available Distributions

Currently, XGBoostLSS supports the following distributions. More continuous distributions, as well as discrete, mixed discrete-continuous, and zero-inflated distributions, will be added soon.

Some Notes

Stabilization

Since XGBoostLSS updates the parameter estimates by optimizing Gradients and Hessians, it is important that these are comparable in magnitude across all distributional parameters. Because the ranges of the parameters can differ substantially, the estimation of Gradients and Hessians may become unstable, so that XGBoostLSS does not converge or converges very slowly. To mitigate these effects, we have implemented a stabilization of Gradients and Hessians.

An additional option to improve convergence is to standardize the (continuous) response variable, e.g., y/100. This is especially useful if the range of the response differs strongly from that of the Gradients and Hessians. Both the built-in stabilization and the standardization of the response need to be carefully considered given the data at hand.
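
As a sketch of one plausible stabilization scheme (not necessarily the exact one XGBoostLSS implements), Gradients and Hessians can be rescaled by their mean absolute value so that the derivatives of all parameters live on a comparable scale:

import numpy as np

def stabilize(values, eps=1e-8):
    # Divide by the mean absolute value so magnitudes are of order one
    return values / (np.mean(np.abs(values)) + eps)

grad_location = np.array([120.0, -80.0, 95.0])   # hypothetical location gradients
grad_scale = np.array([0.002, -0.001, 0.003])    # hypothetical scale gradients

print(stabilize(grad_location))  # both outputs now have comparable magnitude
print(stabilize(grad_scale))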

Runtime

Since XGBoostLSS updates all distributional parameters simultaneously, it requires training [number of iterations] * [number of distributional parameters] trees. Hence, the runtime of XGBoostLSS is generally somewhat higher than that of XGBoost, which requires training only [number of iterations] trees.
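
As a worked example, with 100 boosting iterations and a two-parameter distribution such as the Gaussian (location and scale), XGBoost grows 100 trees while XGBoostLSS grows 100 * 2 = 200:

num_iterations = 100
num_dist_params = 2  # e.g., Gaussian: location and scale
print(num_iterations * num_dist_params)  # 200 trees for XGBoostLSS vs. 100 for XGBoost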

Feedback

Please provide feedback on how to improve XGBoostLSS, or request additional distributions to be implemented, by opening a new issue.

Installation

$ pip install git+https://github.com/StatMixedML/XGBoostLSS.git

How to use

We refer to the examples section for example notebooks.

Reference Paper

März, Alexander (2019). "XGBoostLSS - An extension of XGBoost to probabilistic forecasting". arXiv:1907.03178.
