
dmlc / XGBoost

License: Apache-2.0
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

Programming Languages

C++
36643 projects - #6 most used programming language
Python
139335 projects - #7 most used programming language
CUDA
1817 projects
Scala
5932 projects
R
7636 projects
Java
68154 projects - #9 most used programming language

Projects that are alternatives to or similar to XGBoost

RobustTrees
[ICML 2019, 20 min long talk] Robust Decision Trees Against Adversarial Examples
Stars: ✭ 62 (-99.72%)
Mutual labels:  xgboost, gbdt, gbm, gbrt
fast retraining
Show how to perform fast retraining with LightGBM in different business cases
Stars: ✭ 56 (-99.75%)
Mutual labels:  xgboost, gbdt, gbm, gbrt
JLBoost.jl
A 100%-Julia implementation of Gradient-Boosting Regression Tree algorithms
Stars: ✭ 65 (-99.7%)
Mutual labels:  xgboost, gbdt, gbrt
stackgbm
🌳 Stacked Gradient Boosting Machines
Stars: ✭ 24 (-99.89%)
Mutual labels:  xgboost, gbdt, gbm
Lightgbm
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
Stars: ✭ 13,293 (-39.62%)
Mutual labels:  gbdt, gbm, gbrt
sagemaker-xgboost-container
Docker container based on the open source XGBoost framework (https://xgboost.readthedocs.io/en/latest/) that allows customers to use their own XGBoost scripts in SageMaker.
Stars: ✭ 93 (-99.58%)
Mutual labels:  xgboost, gbm
decision-trees-for-ml
Building Decision Trees From Scratch In Python
Stars: ✭ 61 (-99.72%)
Mutual labels:  xgboost, gbm
HyperGBM
A full pipeline AutoML tool for tabular data
Stars: ✭ 172 (-99.22%)
Mutual labels:  xgboost, gbm
Ergo
A framework for creating mesh networks using the technologies and design patterns of Erlang/OTP in Golang
Stars: ✭ 376 (-98.29%)
Mutual labels:  distributed-systems
Tendermint
⟁ Tendermint Core (BFT Consensus) in Go
Stars: ✭ 4,491 (-79.6%)
Mutual labels:  distributed-systems
Cothority
Scalable collective authority
Stars: ✭ 372 (-98.31%)
Mutual labels:  distributed-systems
Nsq
A realtime distributed messaging platform
Stars: ✭ 20,663 (-6.15%)
Mutual labels:  distributed-systems
Featran
A Scala feature transformation library for data science and machine learning
Stars: ✭ 420 (-98.09%)
Mutual labels:  xgboost
Dsync
A distributed sync package.
Stars: ✭ 377 (-98.29%)
Mutual labels:  distributed-systems
Moleculer
🚀 Progressive microservices framework for Node.js
Stars: ✭ 4,845 (-77.99%)
Mutual labels:  distributed-systems
Gifee
Google's Infrastructure for Everyone Else
Stars: ✭ 370 (-98.32%)
Mutual labels:  distributed-systems
Raft
Raft Consensus Algorithm
Stars: ✭ 370 (-98.32%)
Mutual labels:  distributed-systems
Distributed Systems Technologies And Cases Analysis
Sample source code for the book 《分布式系统常用技术及案例分析》 (Common Technologies and Case Analysis of Distributed Systems)
Stars: ✭ 446 (-97.97%)
Mutual labels:  distributed-systems
Nuraft
C++ implementation of Raft core logic as a replication library
Stars: ✭ 428 (-98.06%)
Mutual labels:  distributed-systems
Awesome System Design
A curated list of awesome System Design (A.K.A. Distributed Systems) resources.
Stars: ✭ 4,999 (-77.29%)
Mutual labels:  distributed-systems

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can scale beyond billions of examples.
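As a quick orientation, a minimal sketch of single-machine training with the Python scikit-learn-style wrapper is shown below; it is not an excerpt from the XGBoost documentation, and the synthetic dataset and hyperparameter values are purely illustrative assumptions.

    # Minimal sketch (illustrative only): train an XGBoost classifier on a
    # synthetic dataset using the scikit-learn style wrapper.
    import xgboost as xgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    # Synthetic binary classification data; sizes and the seed are arbitrary choices.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Gradient-boosted trees (GBDT/GBM); hyperparameters here are example values.
    model = xgb.XGBClassifier(n_estimators=100, max_depth=4, learning_rate=0.1)
    model.fit(X_train, y_train)

    print("test accuracy:", model.score(X_test, y_test))

For distributed training, a similar workflow is exposed through integrations such as the Dask interface (xgboost.dask); cluster setup is outside the scope of this sketch.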

License

© Contributors, 2021. Licensed under an Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated as a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

Other sponsors

The sponsors in this list are donating cloud hours in lieu of a cash donation.

Amazon Web Services

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].