A proposed gradient tree boosting with different loss function in crime forecasting and analysis


Abstract

Gradient tree boosting (GTB) is an emerging artificial intelligence technique in crime forecasting. GTB is a stage-wise additive framework that adopts numerical optimisation methods to minimise the loss function of the predictive model, which in turn enhances its predictive capability. The applied loss function plays a critical role in determining GTB's predictive capability and performance. By default, GTB uses the least-squares function as its loss function. Motivated by this limitation, this study observes and identifies a potential replacement for the default loss function in GTB by applying different existing standard mathematical functions. Crime models are developed based on GTB with different loss functions, and their forecasting performance is compared. In this case study, the least absolute deviation function is found to outperform the other tested loss functions, including GTB's standard least-squares loss, in all developed crime models.
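The loss-function comparison described in the abstract can be illustrated with a minimal sketch, assuming scikit-learn's GradientBoostingRegressor and synthetic data; this is not the authors' implementation, and the crime dataset, features, and hyperparameters used in the paper are not reproduced here.

```python
# Minimal sketch (assumed, not the authors' code): compare GTB loss functions
# with scikit-learn's GradientBoostingRegressor on synthetic regression data.
# In scikit-learn >= 1.0 the losses are named "squared_error" (least squares),
# "absolute_error" (least absolute deviation), and "huber"; older versions
# used "ls", "lad", and "huber".
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a crime-rate regression problem.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit one GTB model per candidate loss function and compare test error.
for loss in ("squared_error", "absolute_error", "huber"):
    model = GradientBoostingRegressor(loss=loss, n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{loss:>15}: test MAE = {mae:.2f}")
```

Under this sketch, the least-absolute-deviation loss ("absolute_error") corresponds to the function the study reports as the best-performing replacement for the default least-squares loss.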

Citation (APA)

Khairuddin, A. R., Alwee, R., & Haron, H. (2020). A proposed gradient tree boosting with different loss function in crime forecasting and analysis. In Advances in Intelligent Systems and Computing (Vol. 1073, pp. 189–198). Springer. https://doi.org/10.1007/978-3-030-33582-3_18
