Diagnosis of Paratuberculosis in Histopathological Images Based on Explainable Artificial Intelligence and Deep Learning


Abstract

Artificial intelligence holds great promise in medical imaging, especially histopathological imaging. However, artificial intelligence algorithms cannot fully explain their decision-making processes. This has brought the explainability issue, i.e., the black-box problem, of artificial intelligence applications to the agenda: an algorithm simply returns an answer for a given image without stating its reasons. To address this problem and improve explainability, explainable artificial intelligence (XAI) has come to the fore and attracted the interest of many researchers. Against this backdrop, this study examines a new and original dataset with a deep learning algorithm and visualizes the output with gradient-weighted class activation mapping (Grad-CAM), one of the XAI techniques. Afterwards, a detailed questionnaire survey was conducted with pathologists on these images; both the decision-making process and the explanations were verified, and the accuracy of the output was tested. The research results greatly assist pathologists in the diagnosis of paratuberculosis.
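
To make the Grad-CAM step concrete, the sketch below shows one common way to compute a class activation heatmap for a convolutional classifier in PyTorch. The model (an untrained ResNet-18), the chosen target layer, and the 224×224 input tile are illustrative assumptions, not the paper's actual architecture or dataset; the paper's own pipeline may differ.

```python
# Minimal Grad-CAM sketch (illustrative only; not the authors' implementation).
import torch
import torch.nn.functional as F
from torchvision import models


def grad_cam(model, image, target_layer, class_idx=None):
    """Return a Grad-CAM heatmap (H, W) for an image tensor of shape (1, 3, H, W)."""
    activations, gradients = {}, {}

    def fwd_hook(_, __, output):
        activations["value"] = output           # feature maps of the target layer

    def bwd_hook(_, grad_in, grad_out):
        gradients["value"] = grad_out[0]        # gradients w.r.t. those feature maps

    h1 = target_layer.register_forward_hook(fwd_hook)
    h2 = target_layer.register_full_backward_hook(bwd_hook)

    model.eval()
    logits = model(image)                       # forward pass
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item() # explain the predicted class
    model.zero_grad()
    logits[0, class_idx].backward()             # gradient of the class score

    h1.remove()
    h2.remove()

    # Channel weights: global-average-pool the gradients over spatial dimensions
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalize to [0, 1]
    return cam[0, 0].detach()


if __name__ == "__main__":
    model = models.resnet18(weights=None)       # untrained stand-in network
    tile = torch.rand(1, 3, 224, 224)           # stands in for a histopathology tile
    heatmap = grad_cam(model, tile, target_layer=model.layer4[-1])
    print(heatmap.shape)                        # torch.Size([224, 224])
```

In practice the resulting heatmap would be overlaid on the original tissue tile so that a pathologist can judge whether the highlighted regions correspond to diagnostically relevant lesions, which is the kind of verification the questionnaire survey in this study performs.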

Citation (APA)

Yiğit, T., Şengöz, N., Özmen, Ö., Hemanth, J., & Işık, A. H. (2022). Diagnosis of Paratuberculosis in Histopathological Images Based on Explainable Artificial Intelligence and Deep Learning. Traitement Du Signal, 39(3), 863–869. https://doi.org/10.18280/ts.390311
