Explainable-AI in Automated Medical Report Generation Using Chest X-ray Images


Abstract

The use of machine learning in healthcare has the potential to revolutionize virtually every aspect of the industry. However, the lack of transparency in AI applications raises concerns about the trustworthiness and reliability of the information they provide. Medical practitioners rely on such systems for clinical decision making, but without adequate explanations, diagnoses made by these systems cannot be fully trusted. Explainable Artificial Intelligence (XAI) aims to improve our understanding of why an AI system produced a given output. Automated medical report generation is one area that would benefit greatly from XAI. This survey provides an extensive literature review of XAI techniques used in medical image analysis and automated medical report generation. We present a systematic classification of the XAI techniques used in this field, highlighting the most important features of each so that future research can select the most appropriate technique for creating understandable and reliable explanations of decisions made by AI systems. In addition to providing an overview of the state of the art in this area, we identify some of the most important open issues on which research should be focused.
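To make one family of XAI techniques discussed in this context concrete — perturbation-based attribution, which explains a prediction by measuring how the model's score changes when image regions are hidden — here is a minimal sketch. The `toy_score` model and all names are hypothetical stand-ins, not the authors' method; a real application would occlude patches of a chest X-ray and query a trained classifier.

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4, baseline=0.0):
    """Perturbation-based explanation: slide an occluding patch over the
    image and record how much the model's score drops at each position.
    Larger drops indicate regions that mattered more to the prediction."""
    h, w = image.shape
    base_score = score_fn(image)
    heat = np.zeros((h, w))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = baseline
            # Score drop caused by hiding this patch
            heat[i:i + patch, j:j + patch] = base_score - score_fn(occluded)
    return heat

# Toy "model": scores an image by the mean brightness of its top-left
# quadrant, standing in for a classifier that keys on one region.
def toy_score(img):
    return float(img[:4, :4].mean())

img = np.ones((8, 8))
heat = occlusion_map(img, toy_score, patch=4)
# The heatmap is high only over the region the toy model depends on.
```

The resulting heatmap can be overlaid on the input image as a saliency map, which is the typical way such explanations are presented to clinicians.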

Citation (APA)

Ahmed, S. B., Solis-Oba, R., & Ilie, L. (2022). Explainable-AI in Automated Medical Report Generation Using Chest X-ray Images. Applied Sciences (Switzerland), 12(22). https://doi.org/10.3390/app122211750
