Multi-modal Magnetic Resonance Imaging (MRI) plays a crucial role in brain tumor segmentation. However, missing modalities are a common phenomenon in clinical practice, leading to degraded tumor segmentation performance. Since complementary information exists among modalities, feature interaction among modalities is important for tumor segmentation. In this work, we propose Modality-adaptive Feature Interaction (MFI) with a multi-modal code to adaptively interact features among modalities under different modality-missing situations. MFI is a simple yet effective unit, based on a graph structure and an attention mechanism, that learns and exchanges complementary features between graph nodes (modalities). Meanwhile, the proposed multi-modal code, indicating whether each modality is missing or not, guides MFI to learn adaptive complementary information between nodes in different missing situations. Applying MFI with the multi-modal code at different stages of a U-shaped architecture, we design a novel network, U-Net-MFI, to interact multi-modal features hierarchically and adaptively for brain tumor segmentation with missing modality(ies). Experiments show that our model outperforms current state-of-the-art methods for brain tumor segmentation with missing modalities.
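To make the idea concrete, the following is a minimal NumPy sketch of an MFI-style unit: modalities are graph nodes, attention scores are computed between nodes, and a binary multi-modal code masks out missing modalities so each node aggregates complementary features only from available ones. This is an illustrative assumption-laden toy, not the authors' implementation; the function name `mfi_sketch`, the weight matrices, and the masking details are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mfi_sketch(feats, code, Wq, Wk, Wv):
    """Toy modality-adaptive feature interaction (hypothetical sketch).

    feats: (M, D) one feature vector per modality node.
    code:  (M,) binary multi-modal code, 1 = modality present, 0 = missing.
    Each present node attends over the other present nodes; missing
    modalities are excluded from attention and zeroed in the output.
    """
    q = feats @ Wq                              # (M, d) queries
    k = feats @ Wk                              # (M, d) keys
    v = feats @ Wv                              # (M, d) values
    scores = q @ k.T / np.sqrt(k.shape[1])      # (M, M) node-to-node affinities
    # multi-modal code masks attention to missing modalities
    scores = np.where(code[None, :] == 1, scores, -1e9)
    attn = softmax(scores, axis=-1)             # rows sum to 1 over present nodes
    out = attn @ v                              # aggregate complementary features
    return out * code[:, None]                  # zero rows of missing modalities

# hypothetical example: 4 MRI modalities with the third one missing
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
code = np.array([1.0, 1.0, 0.0, 1.0])           # e.g. T2 unavailable
updated = mfi_sketch(feats, code, Wq, Wk, Wv)
```

In the paper, such a unit is applied at multiple stages of the U-shaped encoder-decoder so interaction happens hierarchically; the sketch above shows only the single-stage, single-head case.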
CITATION STYLE
Zhao, Z., Yang, H., & Sun, J. (2022). Modality-Adaptive Feature Interaction for Brain Tumor Segmentation with Missing Modalities. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13435 LNCS, pp. 183–192). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-16443-9_18