Few-shot Medical Image Segmentation Regularized with Self-reference and Contrastive Learning


Abstract

Despite the great progress made by deep convolutional neural networks (CNNs) in medical image segmentation, they typically require a large amount of expert-level, densely annotated images for training and generalize poorly to unseen object categories. Few-shot learning has thus been proposed to address these challenges by learning to transfer knowledge from a few annotated support examples. In this paper, we propose a new prototype-based few-shot segmentation method. Unlike previous works, where query features are compared with the learned support prototypes to generate segmentation over the query images, we propose a self-reference regularization in which support features are additionally compared with the learned support prototypes to generate segmentation over the support images. By this, we argue that the learned support prototypes should be representative of each semantic class and discriminative across classes, not only for query images but also for support images. We additionally introduce contrastive learning to impose intra-class cohesion and inter-class separation between support and query features. Experiments on two publicly available datasets demonstrate that the proposed method outperforms the state-of-the-art (SOTA).
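The core prototype-matching pipeline described above can be sketched as follows. This is a minimal illustrative NumPy mock-up, not the authors' implementation: the function names, the masked-average-pooling prototype extraction, the cosine-similarity matching, and the zero threshold are all assumptions chosen for clarity.

```python
import numpy as np

def masked_average_pooling(features, mask):
    # features: (C, H, W) feature map; mask: (H, W) binary foreground mask.
    # The prototype is the mean foreground feature vector.
    masked = features * mask[None]                       # zero out background
    return masked.sum(axis=(1, 2)) / (mask.sum() + 1e-8)

def cosine_similarity_map(features, prototype):
    # Compare every spatial location of a (C, H, W) feature map against
    # a (C,) prototype; returns an (H, W) similarity map in [-1, 1].
    f = features / (np.linalg.norm(features, axis=0, keepdims=True) + 1e-8)
    p = prototype / (np.linalg.norm(prototype) + 1e-8)
    return np.tensordot(p, f, axes=(0, 0))

# Toy example with random features (stand-ins for CNN backbone outputs).
rng = np.random.default_rng(0)
support_feat = rng.normal(size=(4, 8, 8))
support_mask = np.zeros((8, 8))
support_mask[2:6, 2:6] = 1.0

# 1) Learn a prototype from the annotated support image.
proto = masked_average_pooling(support_feat, support_mask)

# 2) Standard path: match query features against the support prototype.
query_feat = rng.normal(size=(4, 8, 8))
query_pred = cosine_similarity_map(query_feat, proto) > 0

# 3) Self-reference path: re-segment the support image with its own
#    prototype, so the prototype is also supervised by the support mask.
self_ref_pred = cosine_similarity_map(support_feat, proto) > 0
```

In the paper's training loop, the self-reference prediction would be compared against the known support mask as an extra loss term, alongside the contrastive objective on support/query features; the sketch above only shows the forward matching step.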

Citation (APA)

Wang, R., Zhou, Q., & Zheng, G. (2022). Few-shot Medical Image Segmentation Regularized with Self-reference and Contrastive Learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13434 LNCS, pp. 514–523). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-16440-8_49
