EXPLAINING XGBOOST PREDICTIONS WITH SHAP VALUE: A COMPREHENSIVE GUIDE TO INTERPRETING DECISION TREE-BASED MODELS

  • Ergün S

Abstract

Understanding which factors affect Key Performance Indicators (KPIs), and how they affect them, is frequently important in sectors where data and data science are crucial. To this end, machine learning is used to model and predict the relevant KPIs. Interpretability, however, is essential to fully understand how a model generates its predictions: it enables users to pinpoint which features have contributed to the model's ability to learn from and comprehend the data. SHAP (SHapley Additive exPlanations) has emerged as a practical approach for evaluating the contribution of input features to model learning, providing an index of the influence of each feature on the model's predictions. In this paper, it is demonstrated that the contribution of features to model learning can be precisely estimated by using SHAP values with decision tree-based models, which are frequently used to model tabular data.
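As a rough illustration of the workflow the abstract describes (not the authors' actual experiments or data), a minimal sketch using the Python shap and xgboost libraries might look like the following; the synthetic dataset and model settings are assumptions for demonstration only.

```python
import numpy as np
import xgboost
import shap

# Synthetic tabular data standing in for a KPI dataset (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Fit a decision tree-based model (an XGBoost regressor).
model = xgboost.XGBRegressor(n_estimators=100, max_depth=3)
model.fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# The mean absolute SHAP value per feature gives a global ranking
# of each feature's contribution to the model's predictions.
mean_abs = np.abs(shap_values).mean(axis=0)
for i, v in enumerate(mean_abs):
    print(f"feature {i}: mean |SHAP| = {v:.3f}")
```

In this sketch the informative features (the first two columns) receive the largest mean absolute SHAP values, which is the kind of feature-contribution estimate the paper examines for tree-based models.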


CITATION STYLE

APA

Ergün, S. (2023). EXPLAINING XGBOOST PREDICTIONS WITH SHAP VALUE: A COMPREHENSIVE GUIDE TO INTERPRETING DECISION TREE-BASED MODELS. New Trends in Computer Sciences, 1(1), 19–31. https://doi.org/10.3846/ntcs.2023.17901
