
jrzaurin / Lightgbm With Focal Loss

An implementation of the focal loss to be used with LightGBM for binary and multi-class classification problems

Programming Languages

python
python3

Labels

Projects that are alternatives to, or similar to, Lightgbm With Focal Loss

HousePrice
Top 1 solution to a monthly housing rent prediction big data competition
Stars: ✭ 17 (-83.5%)
Mutual labels:  lightgbm
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+529.13%)
Mutual labels:  lightgbm
Lambda Packs
Precompiled packages for AWS Lambda
Stars: ✭ 997 (+867.96%)
Mutual labels:  lightgbm
Leaves
pure Go implementation of prediction part for GBRT (Gradient Boosting Regression Trees) models from popular frameworks
Stars: ✭ 261 (+153.4%)
Mutual labels:  lightgbm
Ai competitions
A summary of information about AI competitions
Stars: ✭ 443 (+330.1%)
Mutual labels:  lightgbm
Text Classification Benchmark
A text classification benchmark
Stars: ✭ 18 (-82.52%)
Mutual labels:  lightgbm
mobileRiskUser
Risk-user identification based on mobile network communication behavior (15th/624)
Stars: ✭ 29 (-71.84%)
Mutual labels:  lightgbm
Dc Hi guides
[Data Castle algorithm competition] Premium travel service order-conversion prediction, final rank 11
Stars: ✭ 83 (-19.42%)
Mutual labels:  lightgbm
Openscoring
REST web service for the true real-time scoring (<1 ms) of Scikit-Learn, R and Apache Spark models
Stars: ✭ 536 (+420.39%)
Mutual labels:  lightgbm
Open Solution Value Prediction
Open solution to the Santander Value Prediction Challenge 🐠
Stars: ✭ 34 (-66.99%)
Mutual labels:  lightgbm
My Data Competition Experience
A summary of my experience from multiple Top 5 finishes in machine learning and big data competitions, packed with practical advice
Stars: ✭ 271 (+163.11%)
Mutual labels:  lightgbm
Open Solution Home Credit
Open solution to the Home Credit Default Risk challenge 🏡
Stars: ✭ 397 (+285.44%)
Mutual labels:  lightgbm
Autodl
Automated Deep Learning without ANY human intervention. 1st Solution for AutoDL [email protected]
Stars: ✭ 854 (+729.13%)
Mutual labels:  lightgbm
Dmtk
Microsoft Distributed Machine Learning Toolkit
Stars: ✭ 2,766 (+2585.44%)
Mutual labels:  lightgbm
Lightgbm predict4j
A java implementation of LightGBM predicting part
Stars: ✭ 64 (-37.86%)
Mutual labels:  lightgbm
HyperGBM
A full pipeline AutoML tool for tabular data
Stars: ✭ 172 (+66.99%)
Mutual labels:  lightgbm
Awesome Gradient Boosting Papers
A curated list of gradient boosting research papers with implementations.
Stars: ✭ 704 (+583.5%)
Mutual labels:  lightgbm
Learning to rank
Learning to rank with lightgbm, including data processing, model training, model decision visualization, model interpretability and prediction.
Stars: ✭ 92 (-10.68%)
Mutual labels:  lightgbm
Mlbox
MLBox is a powerful Automated Machine Learning python library.
Stars: ✭ 1,199 (+1064.08%)
Mutual labels:  lightgbm
Mljar Supervised
Automated Machine Learning Pipeline with Feature Engineering and Hyper-Parameters Tuning 🚀
Stars: ✭ 961 (+833.01%)
Mutual labels:  lightgbm

LightGBM with Focal Loss

This is an implementation of the Focal Loss [1] to be used with LightGBM.

The companion Medium post can be found here.

The Focal Loss for LightGBM [2] can be implemented as:

import numpy as np
from scipy.misc import derivative  # deprecated/removed in recent SciPy releases

def focal_loss_lgb(y_pred, dtrain, alpha, gamma):
    """Focal Loss objective for LightGBM. Returns the grad and hess for `fobj`."""
    a, g = alpha, gamma
    y_true = dtrain.label
    def fl(x, t):
        # x: raw margin, t: binary label
        p = 1. / (1. + np.exp(-x))
        return -(a * t + (1 - a) * (1 - t)) * ((1 - (t * p + (1 - t) * (1 - p))) ** g) * (t * np.log(p) + (1 - t) * np.log(1 - p))
    partial_fl = lambda x: fl(x, y_true)
    # first- and second-order numerical derivatives w.r.t. the raw margin
    grad = derivative(partial_fl, y_pred, n=1, dx=1e-6)
    hess = derivative(partial_fl, y_pred, n=2, dx=1e-6)
    return grad, hess
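For intuition, the `derivative` calls above compute central finite differences. A minimal, stand-alone sketch of the same scheme (illustrative only, not SciPy's implementation):

```python
import numpy as np

def central_diff(f, x, n=1, dx=1e-6):
    # Central finite differences: first (n=1) or second (n=2) derivative of f at x
    if n == 1:
        return (f(x + dx) - f(x - dx)) / (2 * dx)
    elif n == 2:
        return (f(x + dx) - 2 * f(x) + f(x - dx)) / dx ** 2
    raise ValueError("n must be 1 or 2")

# Sanity check on f(x) = x**3: f'(2) = 12 and f''(2) = 12
print(central_diff(lambda x: x ** 3, 2.0, n=1))
print(central_diff(lambda x: x ** 3, 2.0, n=2))
```

Numerical differentiation keeps the objective generic (any differentiable loss works), at the cost of a little precision and speed compared with hand-derived gradients.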

To use it during training, one also needs the corresponding evaluation function:

def focal_loss_lgb_eval_error(y_pred, dtrain, alpha, gamma):
    """Focal Loss evaluation metric for LightGBM's `feval`."""
    a, g = alpha, gamma
    y_true = dtrain.label
    p = 1. / (1. + np.exp(-y_pred))
    loss = -(a * y_true + (1 - a) * (1 - y_true)) * ((1 - (y_true * p + (1 - y_true) * (1 - p))) ** g) * (y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
    # (metric name, value, is_higher_better): lower loss is better, hence False
    return 'focal_loss', np.mean(loss), False
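As a quick sanity check on the formula: with `gamma = 0` and `alpha = 0.5`, the focal loss reduces to one half of the standard binary cross-entropy. A NumPy-only sketch, independent of LightGBM:

```python
import numpy as np

def focal_loss_value(x, t, alpha, gamma):
    # Pointwise focal loss on raw margins x with binary labels t
    p = 1. / (1. + np.exp(-x))
    return -(alpha * t + (1 - alpha) * (1 - t)) * \
        ((1 - (t * p + (1 - t) * (1 - p))) ** gamma) * \
        (t * np.log(p) + (1 - t) * np.log(1 - p))

x = np.array([-2.0, -0.5, 0.3, 1.7])   # raw margins
t = np.array([0.0, 1.0, 0.0, 1.0])     # binary labels
p = 1. / (1. + np.exp(-x))
bce = -(t * np.log(p) + (1 - t) * np.log(1 - p))
print(np.allclose(focal_loss_value(x, t, alpha=0.5, gamma=0.0), 0.5 * bce))  # True
```

With `gamma = 0` the modulating factor is 1 and `alpha = 0.5` weights both classes equally, so only the constant 0.5 separates it from plain cross-entropy.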

And to use it, simply:

import lightgbm as lgb

focal_loss = lambda x, y: focal_loss_lgb(x, y, alpha=0.25, gamma=1.)
eval_error = lambda x, y: focal_loss_lgb_eval_error(x, y, alpha=0.25, gamma=1.)
lgbtrain = lgb.Dataset(X_tr, y_tr, free_raw_data=True)
lgbeval = lgb.Dataset(X_val, y_val, reference=lgbtrain)
params = {'learning_rate': 0.1}
# num_boost_round is an argument to lgb.train, not a params entry
model = lgb.train(params, lgbtrain, num_boost_round=10, valid_sets=[lgbeval], fobj=focal_loss, feval=eval_error)
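One caveat worth remembering: when training with a custom objective, LightGBM's `predict` returns raw margin scores rather than probabilities, so a sigmoid must be applied afterwards. A small illustration on made-up scores:

```python
import numpy as np

def sigmoid(x):
    return 1. / (1. + np.exp(-x))

raw_margins = np.array([-1.2, 0.0, 2.3])  # stand-in for model.predict(X_val)
probs = sigmoid(raw_margins)              # convert margins to probabilities
preds = (probs > 0.5).astype(int)
print(preds)  # [0 0 1]
```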

In the examples directory you will find more details, including how to use Hyperopt in combination with LightGBM and the Focal Loss, or how to adapt the Focal Loss to a multi-class classification problem.
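The multi-class adaptation combines the focal term with a softmax over the raw class margins. A rough NumPy-only sketch of the loss value (function names are illustrative, and the repo's version additionally derives the grad and hess):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # numerically stabilized
    return e / e.sum(axis=1, keepdims=True)

def focal_loss_multiclass_value(raw, y, alpha, gamma):
    # raw: (n_samples, num_class) margins; y: integer class labels
    p = softmax(raw)
    pt = p[np.arange(len(y)), y.astype(int)]  # probability of the true class
    return -alpha * (1 - pt) ** gamma * np.log(pt)

raw = np.array([[2.0, 0.5, -1.0], [0.1, 1.2, 0.3]])
y = np.array([0, 1])
# With gamma = 0 and alpha = 1 this is plain multi-class cross-entropy
ce = -np.log(softmax(raw)[np.arange(2), y])
print(np.allclose(focal_loss_multiclass_value(raw, y, 1.0, 0.0), ce))  # True
```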

Any comment: [email protected]

References:

[1] Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, Piotr Dollár, 2017. Focal Loss for Dense Object Detection

[2] Guolin Ke, Qi Meng, Thomas Finley, et al., 2017. LightGBM: A Highly Efficient Gradient Boosting Decision Tree
