A Multi-Scale Cross-Fusion Medical Image Segmentation Network Based on Dual-Attention Mechanism Transformer

Abstract

The U-net network, with its simple yet powerful encoder–decoder structure, dominates the field of medical image segmentation. However, convolution operations are limited by their receptive fields and cannot model long-range dependencies. The Transformer, by contrast, captures long-range dependencies through its core self-attention mechanism and has therefore been widely adopted in medical image segmentation. Still, both CNNs and Transformers compute correlations only within a single sample, ignoring correlations between different samples. To address these problems, we propose a new Transformer module, the Dual-Attention Transformer (DAT), which captures correlations within a single sample while also learning correlations between different samples. Furthermore, U-net and several of its variants suffer from inadequate feature fusion, so we also improve the skip connections to strengthen the association between feature maps at different scales, reduce the semantic gap between the encoder and decoder, and further improve segmentation performance. We call the resulting architecture DATUnet. Extensive experiments on the Synapse and ACDC datasets validate the superior performance of our network: we achieve average DSCs (%) of 83.6 and 90.9 and average HD95 values of 13.99 and 1.466 on Synapse and ACDC, respectively.
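The abstract does not specify how the DAT module is implemented. As a rough illustration of the idea of combining intra-sample and inter-sample attention, the following minimal NumPy sketch pairs standard self-attention (tokens of one sample attending to each other) with an external-attention-style branch, where a small learnable memory shared across the whole dataset serves as a hypothetical stand-in for the paper's cross-sample correlation mechanism. All names, shapes, and the memory-based formulation are illustrative assumptions, not the authors' actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    """Intra-sample attention: tokens of one sample attend to each other."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (n, n) token-to-token scores
    return softmax(scores) @ v                # (n, d)

def external_attention(x, Mk, Mv):
    """Cross-sample-style attention via a learnable memory (Mk, Mv) shared
    across the dataset -- a hypothetical stand-in for DAT's inter-sample
    branch; Mk/Mv would be trained jointly with the network."""
    attn = softmax(x @ Mk.T)                                 # (n, m) token-to-memory
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)   # double normalization
    return attn @ Mv                                         # (n, d)

rng = np.random.default_rng(0)
n, d, m = 16, 8, 4          # tokens per sample, channels, memory slots (assumed)
x = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
Mk, Mv = rng.standard_normal((m, d)), rng.standard_normal((m, d))

# Dual attention: sum the intra-sample and inter-sample branches.
out = self_attention(x, Wq, Wk, Wv) + external_attention(x, Mk, Mv)
print(out.shape)  # (16, 8)
```

In practice, such a module would sit inside the Transformer blocks of the encoder, and the memory matrices would be learned parameters rather than random arrays.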

Citation (APA)
Cui, J., Wang, L., & Jiang, S. (2023). A Multi-Scale Cross-Fusion Medical Image Segmentation Network Based on Dual-Attention Mechanism Transformer. Applied Sciences (Switzerland), 13(19). https://doi.org/10.3390/app131910881
