
dmitryikh / Leaves

License: MIT
Pure Go implementation of the prediction part of GBRT (Gradient Boosting Regression Trees) models from popular frameworks

Programming Languages

go
31211 projects - #10 most used programming language
golang
3204 projects

Projects that are alternatives of or similar to Leaves

aws-machine-learning-university-dte
Machine Learning University: Decision Trees and Ensemble Methods
Stars: ✭ 119 (-54.41%)
Mutual labels:  xgboost, lightgbm, decision-trees
stackgbm
🌳 Stacked Gradient Boosting Machines
Stars: ✭ 24 (-90.8%)
Mutual labels:  xgboost, lightgbm, decision-trees
Adversarial Robustness Toolbox
Adversarial Robustness Toolbox (ART) - Python Library for Machine Learning Security - Evasion, Poisoning, Extraction, Inference - Red and Blue Teams
Stars: ✭ 2,638 (+910.73%)
Mutual labels:  xgboost, lightgbm, decision-trees
HousePrice
Top 1 solution in a housing monthly rent prediction big data competition
Stars: ✭ 17 (-93.49%)
Mutual labels:  xgboost, lightgbm
Arch-Data-Science
Archlinux PKGBUILDs for Data Science, Machine Learning, Deep Learning, NLP and Computer Vision
Stars: ✭ 92 (-64.75%)
Mutual labels:  xgboost, lightgbm
AutoTabular
Automatic machine learning for tabular data. ⚡🔥⚡
Stars: ✭ 51 (-80.46%)
Mutual labels:  xgboost, lightgbm
mlforecast
Scalable machine 🤖 learning for time series forecasting.
Stars: ✭ 96 (-63.22%)
Mutual labels:  xgboost, lightgbm
datascienv
datascienv is a package that sets up your data science environment in a single line of code with all dependencies; it also includes pyforest, which provides a single-line import of all required ML libraries
Stars: ✭ 53 (-79.69%)
Mutual labels:  xgboost, lightgbm
Apartment-Interest-Prediction
Predict people interest in renting specific NYC apartments. The challenge combines structured data, geolocalization, time data, free text and images.
Stars: ✭ 17 (-93.49%)
Mutual labels:  xgboost, lightgbm
Tencent2017 Final Rank28 code
Rank 28 code from the 1st Tencent Social Advertising College Algorithm Competition (2017)
Stars: ✭ 85 (-67.43%)
Mutual labels:  xgboost, lightgbm
Kaggle-Competition-Sberbank
Top 1% rankings (22/3270) code sharing for Kaggle competition Sberbank Russian Housing Market: https://www.kaggle.com/c/sberbank-russian-housing-market
Stars: ✭ 31 (-88.12%)
Mutual labels:  xgboost, lightgbm
xgboost-lightgbm-hyperparameter-tuning
Bayesian Optimization and Grid Search for xgboost/lightgbm
Stars: ✭ 40 (-84.67%)
Mutual labels:  xgboost, lightgbm
HyperGBM
A full pipeline AutoML tool for tabular data
Stars: ✭ 172 (-34.1%)
Mutual labels:  xgboost, lightgbm
Kaggle
Kaggle Kernels (Python, R, Jupyter Notebooks)
Stars: ✭ 26 (-90.04%)
Mutual labels:  xgboost, lightgbm
RobustTrees
[ICML 2019, 20 min long talk] Robust Decision Trees Against Adversarial Examples
Stars: ✭ 62 (-76.25%)
Mutual labels:  xgboost, decision-trees
MSDS696-Masters-Final-Project
Earthquake Prediction Challenge with LightGBM and XGBoost
Stars: ✭ 58 (-77.78%)
Mutual labels:  xgboost, lightgbm
neptune-client
📒 Experiment tracking tool and model registry
Stars: ✭ 348 (+33.33%)
Mutual labels:  xgboost, lightgbm
JLBoost.jl
A 100%-Julia implementation of Gradient-Boosting Regression Tree algorithms
Stars: ✭ 65 (-75.1%)
Mutual labels:  xgboost, lightgbm
ai-deployment
Focused on putting AI models into production and model deployment
Stars: ✭ 149 (-42.91%)
Mutual labels:  xgboost, lightgbm
recsys2019
The complete code and notebooks used for the ACM Recommender Systems Challenge 2019
Stars: ✭ 26 (-90.04%)
Mutual labels:  xgboost, lightgbm

leaves

Introduction

leaves is a library implementing the prediction code for GBRT (Gradient Boosting Regression Trees) models in pure Go. The goal of the project is to make it possible to use models from popular GBRT frameworks in Go programs without C API bindings.

NOTE: Before the 1.0.0 release, the API is subject to change.

Features

  • General Features:
    • support parallel predictions for batches
    • support sigmoid, softmax transformation functions
    • support getting leaf indices of decision trees
  • Support LightGBM (repo) models:
    • read models from text format and from JSON format
    • support gbdt, rf (random forest) and dart models
    • support multiclass predictions
    • additional optimizations for categorical features (for example, the one-hot decision rule)
    • additional optimizations exploiting prediction-only usage
  • Support XGBoost (repo) models:
    • read models from binary format
    • support gbtree, gblinear, dart models
    • support multiclass predictions
    • support missing values (nan)
  • Support scikit-learn (repo) tree models (experimental support):
    • read models from pickle format (protocol 0)
    • support sklearn.ensemble.GradientBoostingClassifier

Usage examples

To get started, go get this repository:

go get github.com/dmitryikh/leaves

Minimal example:

package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// 1. Read model
	useTransformation := true
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt", useTransformation)
	if err != nil {
		panic(err)
	}

	// 2. Do predictions!
	fvals := []float64{1.0, 2.0, 3.0}
	p := model.PredictSingle(fvals, 0)
	fmt.Printf("Prediction for %v: %f\n", fvals, p)
}
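
PredictSingle returns a single value, which fits regression and binary classification. For multiclass models, Ensemble.Predict fills a slice with one raw score per class. A sketch that continues the example above (the NRawOutputGroups accessor name is taken from the godoc; verify it against your leaves version):

// Multiclass prediction sketch: assumes `model` and `fvals` from the
// minimal example above. NRawOutputGroups() is expected to return the
// number of classes (accessor name per godoc; may differ between versions).
predictions := make([]float64, model.NRawOutputGroups())
if err := model.Predict(fvals, 0, predictions); err != nil {
	panic(err)
}
fmt.Printf("Class scores for %v: %v\n", fvals, predictions)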

To use an XGBoost model, just change leaves.LGEnsembleFromFile to leaves.XGEnsembleFromFile.
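
For batch predictions, PredictDense evaluates a row-major matrix of feature values and can split the work across several threads. A sketch continuing the example above, assuming the PredictDense(vals, nrows, ncols, predictions, nEstimators, nThreads) signature described in the godoc:

// Batch prediction sketch: 2 rows × 3 features in row-major order.
// PredictDense is expected to split the rows across nThreads workers.
vals := []float64{
	1.0, 2.0, 3.0, // row 0
	4.0, 5.0, 6.0, // row 1
}
nrows, ncols := 2, 3
predictions := make([]float64, nrows*model.NRawOutputGroups())
nThreads := 4
if err := model.PredictDense(vals, nrows, ncols, predictions, 0, nThreads); err != nil {
	panic(err)
}
fmt.Println(predictions)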

Documentation

Documentation is hosted on godoc (link). It contains detailed usage examples and a full API reference. Some additional usage examples can be found in leaves_test.go.

Compatibility

Most leaves features are tested to be compatible with old and upcoming versions of GBRT libraries. In compatibility.md one can find a detailed report on leaves correctness against different versions of external GBRT libraries.

Some additional information on new features and backward compatibility can be found in NOTES.md.

Benchmark

Below are comparisons of prediction speed on batches (~1000 objects per API call). Hardware: MacBook Pro (15-inch, 2017), 2.9 GHz Intel Core i7, 16 GB 2133 MHz LPDDR3. The C API implementations were called from Python bindings, but the large batch size should make the overhead of the Python bindings negligible. leaves benchmarks were run by means of the Go test framework: go test -bench. See benchmark for more details on measurements. See testdata/README.md for data preparation pipelines.
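
For reference, a comparable benchmark can be written with the standard Go testing package. The sketch below is illustrative only, not the repository's benchmark code: the model file, feature values, and thread count are placeholders, and the NFeatures/NRawOutputGroups accessors are taken from the godoc (verify against your version):

package leaves_test

import (
	"testing"

	"github.com/dmitryikh/leaves"
)

// Illustrative benchmark sketch for batch prediction; file name and
// feature values are placeholders.
func BenchmarkPredictDense(b *testing.B) {
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt", true)
	if err != nil {
		b.Fatal(err)
	}
	nrows := 1000
	ncols := model.NFeatures()
	vals := make([]float64, nrows*ncols) // fill with real feature values in practice
	predictions := make([]float64, nrows*model.NRawOutputGroups())
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		if err := model.PredictDense(vals, nrows, ncols, predictions, 0, 4); err != nil {
			b.Fatal(err)
		}
	}
}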

Single thread:

Test Case            | Features | Trees | Batch size | C API | leaves
---------------------|----------|-------|------------|-------|-------
LightGBM MS LTR      | 137      | 500   | 1000       | 49ms  | 51ms
LightGBM Higgs       | 28       | 500   | 1000       | 50ms  | 50ms
LightGBM KDD Cup 99* | 41       | 1200  | 1000       | 70ms  | 85ms
XGBoost Higgs        | 28       | 500   | 1000       | 44ms  | 50ms

4 threads:

Test Case            | Features | Trees | Batch size | C API | leaves
---------------------|----------|-------|------------|-------|-------
LightGBM MS LTR      | 137      | 500   | 1000       | 14ms  | 14ms
LightGBM Higgs       | 28       | 500   | 1000       | 14ms  | 14ms
LightGBM KDD Cup 99* | 41       | 1200  | 1000       | 19ms  | 24ms
XGBoost Higgs        | 28       | 500   | 1000       | ?     | 14ms

(?) - currently I'm unable to utilize multithreading for XGBoost predictions by means of the Python bindings

(*) - the KDD Cup 99 problem involves continuous and categorical features simultaneously

Limitations

  • LightGBM models:
    • limited support of transformation functions (only sigmoid and softmax are supported; see the sketch after this list)
  • XGBoost models:
    • limited support of transformation functions (only sigmoid and softmax are supported)
    • predictions can diverge slightly from the C API because of floating-point conversions and comparison tolerances
  • scikit-learn tree models:
    • no support for transformation functions; output scores are raw scores (as from GradientBoostingClassifier.decision_function)
    • only pickle protocol 0 is supported
    • predictions can diverge slightly from sklearn predictions because of floating-point conversions and comparison tolerances
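
When a transformation isn't supported, or for scikit-learn models where only raw scores are returned, one can load the model without the built-in transformation and apply the link function manually. A minimal sketch for the binary sigmoid case, using the same loading API as the usage example above (the model file name is a placeholder):

package main

import (
	"fmt"
	"math"

	"github.com/dmitryikh/leaves"
)

// sigmoid maps a raw margin score to a probability in (0, 1).
func sigmoid(raw float64) float64 {
	return 1.0 / (1.0 + math.Exp(-raw))
}

func main() {
	// Pass useTransformation = false to obtain raw scores.
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt", false)
	if err != nil {
		panic(err)
	}
	raw := model.PredictSingle([]float64{1.0, 2.0, 3.0}, 0)
	fmt.Printf("probability: %f\n", sigmoid(raw))
}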

Contacts

If you are interested in the project or have questions, please contact me by email: khdmitryi at gmail.com
