On the importance of interpretable machine learning predictions to inform clinical decision making in oncology

24 citations · 55 Mendeley readers

Abstract

Machine learning (ML)-based tools can guide individualized clinical management and decision-making by predicting a patient’s future health state. Through their ability to model complex nonlinear relationships, ML algorithms can often outperform traditional statistical prediction approaches, but this same reliance on nonlinear functions can make ML techniques less interpretable than traditional statistical methodologies. While intrinsic interpretability has its benefits, many model-agnostic approaches now exist and can provide insight into how ML systems make decisions. In this paper, we describe how different algorithms can be interpreted and introduce some techniques for interpreting complex nonlinear algorithms.
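As an illustration of the kind of model-agnostic technique the abstract refers to, the sketch below implements permutation feature importance: shuffle one feature at a time and measure how much the model's error grows. This is a minimal, self-contained example on synthetic data with a least-squares model standing in for an arbitrary black-box predictor; it is not drawn from the paper itself.

```python
# Hedged sketch of permutation importance, a model-agnostic interpretation
# technique. All data and the model here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends strongly on x0, weakly on x1, not at all on x2.
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Fit a simple least-squares model (stand-in for any black-box predictor).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(M):
    return M @ coef

def mse(a, b):
    return float(np.mean((a - b) ** 2))

baseline = mse(y, predict(X))

def permutation_importance(X, y, predict, n_repeats=10):
    """Mean increase in MSE when each feature is shuffled in turn."""
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break the feature-target link
            importances[j] += mse(y, predict(Xp)) - baseline
    return importances / n_repeats

imp = permutation_importance(X, y, predict)
# Feature 0 should dominate; feature 2 should contribute roughly nothing.
```

Because the technique only needs predictions, the same loop applies unchanged to any fitted model, which is what makes it model-agnostic.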

Citation (APA)

Lu, S. C., Swisher, C. L., Chung, C., Jaffray, D., & Sidey-Gibbons, C. (2023). On the importance of interpretable machine learning predictions to inform clinical decision making in oncology. Frontiers in Oncology. Frontiers Media S.A. https://doi.org/10.3389/fonc.2023.1129380
