Tree boosting has empirically proven to be a highly effective and versatile approach to predictive modeling. Its core strength is that it adaptively determines the local neighborhoods of the model, thereby accounting for the bias-variance trade-off during model fitting. Recently, a tree boosting method known as XGBoost has gained popularity by delivering high predictive accuracy; it introduces several refinements that allow it to manage the bias-variance trade-off even more carefully. In this manuscript, the accuracy of XGBoost is further enhanced by applying a loss function named squared logistic loss (SqLL). The accuracy of the proposed algorithm, i.e., XGBoost with SqLL, is evaluated using the train/test split method, K-fold cross-validation, and stratified K-fold cross-validation.
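XGBoost accepts a custom objective supplied as a function returning the first and second derivatives (gradient and Hessian) of the loss with respect to the raw predictions. The sketch below, in NumPy, assumes SqLL takes the form l(m) = log(1 + exp(-m))^2 on the margin m = y*f with labels y in {-1, +1}; the exact parameterization in the paper may differ, so treat this as an illustrative derivation rather than the authors' definition.

```python
import numpy as np

def sqll_objective(preds, labels):
    """Gradient and Hessian of an assumed squared logistic loss
    l(m) = log(1 + exp(-m))**2, where m = y * f and y in {-1, +1}.

    Returns (grad, hess) in the shape XGBoost expects from a custom
    objective. The SqLL form here is an assumption for illustration.
    """
    m = labels * preds
    # s = sigmoid(-m), computed stably as exp(-logaddexp(0, m))
    s = np.exp(-np.logaddexp(0.0, m))
    # log_term = log(1 + exp(-m)), computed stably
    log_term = np.logaddexp(0.0, -m)
    # d l / d f = -2 * y * sigmoid(-m) * log(1 + exp(-m))
    grad = -2.0 * labels * s * log_term
    # d^2 l / d f^2 = 2 * s * ((1 - s) * log_term + s), strictly positive,
    # so every Newton step in the boosting update is well defined
    hess = 2.0 * s * ((1.0 - s) * log_term + s)
    return grad, hess
```

A function with this signature could then be passed to `xgboost.train` via its `obj` parameter (after mapping {0, 1} labels to {-1, +1}); the strictly positive Hessian is what lets XGBoost's second-order approximation of the loss remain well behaved.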
Sharma, N., Anju, & Juneja, A. (2019). Extreme gradient boosting with squared logistic loss function. In Advances in Intelligent Systems and Computing (Vol. 748, pp. 313–322). Springer Verlag. https://doi.org/10.1007/978-981-13-0923-6_27