Web-Based Multimodal Deep Learning Platform with XRAI Explainability for Real-Time Skin Lesion Classification and Clinical Decision Support

Abstract

Background: Skin cancer represents one of the most prevalent malignancies worldwide, with melanoma accounting for approximately 75% of skin cancer-related deaths despite comprising fewer than 5% of cases. Early detection dramatically improves survival rates from 14% to over 99%, highlighting the urgent need for accurate and accessible diagnostic tools. While deep learning has shown promise in dermatological diagnosis, existing approaches lack clinical explainability and deployable interfaces that bridge the gap between research innovation and practical healthcare applications.

Methods: This study implemented a comprehensive multimodal deep learning framework using the HAM10000 dataset (10,015 dermatoscopic images across seven diagnostic categories). Three CNN architectures (DenseNet-121, EfficientNet-B3, and ResNet-50) were systematically compared, integrating patient metadata (age, sex, and anatomical location) with dermatoscopic image analysis. The first implementation of XRAI (eXplanation with Region-based Attribution for Images) explainability for skin lesion classification was developed, providing spatially coherent explanations aligned with clinical reasoning patterns. A deployable web-based clinical interface was created, featuring real-time inference, comprehensive safety protocols, risk stratification, and evidence-based cosmetic recommendations for benign conditions.

Results: EfficientNet-B3 achieved superior performance with 89.09% test accuracy and 90.08% validation accuracy, significantly outperforming DenseNet-121 (82.83%) and ResNet-50 (78.78%). Test-time augmentation improved test accuracy by 1.00 percentage point to 90.09%. The model demonstrated strong performance for critical malignant conditions: melanoma (81.6% confidence), basal cell carcinoma (82.1% confidence), and actinic keratoses (88% confidence). XRAI analysis revealed clinically meaningful attention patterns focusing on irregular pigmentation for melanoma, ulcerated borders for basal cell carcinoma, and surface irregularities for precancerous lesions. Error analysis showed that misclassifications occurred primarily in visually ambiguous cases, with high correlation (0.855–0.968) between model attention and ideal features. The web application successfully validated real-time diagnostic capabilities, with appropriate emergency protocols for malignant conditions and comprehensive cosmetic guidance for benign lesions.

Conclusions: This research developed the first clinically deployable skin lesion classification system combining diagnostic accuracy with explainable AI and practical patient guidance. The integration of XRAI explainability provides essential transparency for clinical acceptance, while the web-based deployment democratizes access to advanced dermatological AI capabilities. Comprehensive validation establishes readiness for controlled clinical trials and potential integration into healthcare workflows, particularly benefiting underserved regions with limited specialist availability. This work bridges the critical gap between research-grade AI models and practical clinical utility, establishing a foundation for responsible AI integration in dermatological practice.
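The test-time augmentation (TTA) step reported above — averaging predictions over transformed views of the same image — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not specify the augmentation set, so simple flips are assumed, and `toy_model` is a hypothetical stand-in for the trained EfficientNet-B3 classifier over the seven HAM10000 classes.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def tta_predict(model_fn, image):
    """Average class probabilities over flipped views of one image.

    model_fn maps an HxWxC array to a logit vector; the flip-based
    augmentation set here is an assumption for illustration only.
    """
    views = [
        image,
        np.flip(image, axis=1),       # horizontal flip
        np.flip(image, axis=0),       # vertical flip
        np.flip(image, axis=(0, 1)),  # both
    ]
    probs = np.stack([softmax(model_fn(v)) for v in views])
    return probs.mean(axis=0)  # averaged distribution over the classes

# Toy stand-in model: linear map from mean channel intensities to 7 logits.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 7))
toy_model = lambda img: img.mean(axis=(0, 1)) @ W

avg_probs = tta_predict(toy_model, rng.random((8, 8, 3)))
print(avg_probs.shape)  # (7,) — one probability per diagnostic category
```

Averaging the softmax outputs (rather than picking one view's argmax) is what typically yields the small but consistent accuracy gain reported for TTA, since view-specific prediction noise partially cancels.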

Citation (APA)

Aksoy, S., Demircioglu, P., & Bogrekci, I. (2025). Web-Based Multimodal Deep Learning Platform with XRAI Explainability for Real-Time Skin Lesion Classification and Clinical Decision Support. Cosmetics, 12(5). https://doi.org/10.3390/cosmetics12050194
