Extreme gradient boosting with squared logistic loss function

Abstract

Tree boosting has empirically proven to be a highly effective and versatile approach to predictive modeling. The core argument is that tree boosting adaptively determines the local neighborhoods of the model, thereby taking the bias-variance trade-off into account during model fitting. Recently, a tree boosting method known as XGBoost has gained popularity by delivering higher accuracy. XGBoost further introduces some improvements that allow it to handle the bias-variance trade-off even more carefully. In this manuscript, the accuracy of XGBoost is further enhanced by applying a loss function named squared logistic loss (SqLL). The accuracy of the proposed algorithm, i.e., XGBoost with SqLL, is evaluated using the train/test split method, K-fold cross-validation, and stratified K-fold cross-validation.
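To make the idea concrete, below is a minimal sketch of how a squared logistic loss could be plugged into XGBoost as a custom objective and evaluated with stratified K-fold cross-validation, one of the three protocols the abstract lists. The loss form L(y, f) = [log(1 + exp(-y f))]^2 with y in {-1, +1} is an assumption made for illustration (the paper defines the exact SqLL it uses), and the function name `sqll_objective`, the hyperparameters, and the synthetic dataset are likewise illustrative, not the authors' reference implementation.

```python
import numpy as np
import xgboost as xgb
from scipy.special import expit  # numerically stable sigmoid
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold

def sqll_objective(preds, dtrain):
    """Gradient and Hessian of the assumed loss L(f) = log(1 + exp(-y f))**2."""
    y = 2.0 * dtrain.get_label() - 1.0          # map {0, 1} labels to {-1, +1}
    margin = y * preds
    ell = np.logaddexp(0.0, -margin)            # log(1 + exp(-y f)), overflow-safe
    s = expit(-margin)                          # sigmoid(-y f)
    grad = -2.0 * ell * y * s                   # first derivative dL/df
    hess = 2.0 * (s * s + ell * s * (1.0 - s))  # second derivative (y^2 = 1), always > 0
    return grad, hess

# Illustrative synthetic binary-classification data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
params = {"max_depth": 3, "eta": 0.1}

# Stratified K-fold evaluation of XGBoost trained with the custom objective.
for train_idx, test_idx in StratifiedKFold(
        n_splits=5, shuffle=True, random_state=0).split(X, y):
    dtrain = xgb.DMatrix(X[train_idx], label=y[train_idx])
    dtest = xgb.DMatrix(X[test_idx], label=y[test_idx])
    booster = xgb.train(params, dtrain, num_boost_round=100, obj=sqll_objective)
    pred = (booster.predict(dtest) > 0.0).astype(int)  # predict() returns raw margins
    print("fold accuracy:", (pred == y[test_idx]).mean())
```

Because the objective is custom, `predict()` returns untransformed margin scores, hence the threshold at zero; the same booster could equally be scored under a single train/test split, the first evaluation method the abstract mentions.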

Citation (APA)

Sharma, N., Anju, & Juneja, A. (2019). Extreme gradient boosting with squared logistic loss function. In Advances in Intelligent Systems and Computing (Vol. 748, pp. 313–322). Springer Verlag. https://doi.org/10.1007/978-981-13-0923-6_27
