Fusion high-resolution network for diagnosing ChestX-ray images

19 Citations · 17 Mendeley Readers

Abstract

The application of deep convolutional neural networks (CNNs) to medical image processing has attracted extensive attention and demonstrated remarkable progress. A growing number of deep learning methods have been devoted to classifying ChestX-ray (CXR) images, and most existing approaches are based on classic pretrained models trained on global ChestX-ray images. In this paper, we diagnose ChestX-ray images using our proposed Fusion High-Resolution Network (FHRNet). FHRNet consists of three branch convolutional neural networks and fuses its global and local feature extractors by concatenating their global average pooling layers; the network is then fine-tuned for thorax disease classification. Compared with other available methods, our experiments show that the proposed model yields better disease classification performance on the ChestX-ray14 dataset, as measured by the receiver operating characteristic curve and the area-under-the-curve (AUC) score. An ablation study further confirms that the global and local branch networks each contribute to the classification accuracy for thorax diseases.
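The fusion step described above can be sketched in code: each branch network yields a feature map, global average pooling (GAP) reduces each map to a per-channel vector, and the pooled vectors are concatenated before the final classifier. The shapes, branch count, and values below are illustrative assumptions, not details from the paper.

```python
def global_average_pool(feature_map):
    """Average each channel's H x W plane to a single value.

    feature_map: list of channels, each a list of rows (H x W).
    Returns one pooled value per channel.
    """
    pooled = []
    for channel in feature_map:
        total = sum(sum(row) for row in channel)
        count = sum(len(row) for row in channel)
        pooled.append(total / count)
    return pooled


def fuse_branches(branch_feature_maps):
    """Concatenate the GAP vectors of all branch outputs into one vector,
    mimicking the concatenation of global average pooling layers."""
    fused = []
    for fmap in branch_feature_maps:
        fused.extend(global_average_pool(fmap))
    return fused


# Toy example: two branches (e.g., a global and a local extractor),
# each producing 2 channels of 2x2 feature maps.
global_branch = [[[1.0, 3.0], [5.0, 7.0]], [[2.0, 2.0], [2.0, 2.0]]]
local_branch = [[[0.0, 4.0], [4.0, 0.0]], [[1.0, 1.0], [1.0, 1.0]]]
fused = fuse_branches([global_branch, local_branch])
# fused = [4.0, 2.0, 2.0, 1.0] -- fed to the classification head
```

In a real implementation each branch would be a high-resolution CNN backbone and the fused vector would feed a fully connected layer with one sigmoid output per thorax disease label; this sketch only illustrates the pooling-and-concatenation fusion itself.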

Citation (APA)

Huang, Z., Lin, J., Xu, L., Wang, H., Bai, T., Pang, Y., & Meen, T. H. (2020). Fusion high-resolution network for diagnosing ChestX-ray images. Electronics (Switzerland), 9(1). https://doi.org/10.3390/electronics9010190
