Dual-task self-supervision for cross-modality domain adaptation

Abstract

Data annotation is an expensive and time-consuming bottleneck for deep learning based medical image analysis. To ease the need for annotations, domain adaptation has recently been introduced to generalize neural networks from a labeled source domain to an unlabeled target domain without much performance degradation. In this paper, we propose a novel target-domain self-supervision scheme for domain adaptation, in which an edge generation auxiliary task is constructed to assist the primary segmentation task, so as to extract better target representations and improve target segmentation performance. In addition, to leverage the detailed information contained in low-level features, we propose a hierarchical low-level adversarial learning mechanism that encourages low-level features to be domain-uninformative in a hierarchical way, so that segmentation can benefit from low-level features without being affected by domain shift. Combining these two approaches, we develop a cross-modality domain adaptation framework that employs dual-task collaboration for target-domain self-supervision and encourages low-level detailed features to be domain-uninformative for better alignment. Our proposed framework achieves state-of-the-art results on public cross-modality segmentation datasets.
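As a rough illustration of the abstract's idea (not the authors' implementation), the PyTorch-style sketch below pairs a shared encoder with a primary segmentation head and an auxiliary edge-generation head, and applies a patch discriminator to low-level features for adversarial alignment. All module names, network depths, pseudo-edge supervision, and loss weights are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DualTaskSegNet(nn.Module):
    """Shared encoder with two decoders: primary segmentation and auxiliary edge generation."""
    def __init__(self, in_ch=1, num_classes=5, base=16):
        super().__init__()
        # Shared encoder (only two stages here for brevity).
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(inplace=True))
        self.enc2 = nn.Sequential(nn.Conv2d(base, base * 2, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        # Primary task head: semantic segmentation logits.
        self.seg_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(base, num_classes, 1))
        # Auxiliary task head: edge map used for target-domain self-supervision.
        self.edge_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(base, 1, 1))

    def forward(self, x):
        low = self.enc1(x)          # low-level features, aligned adversarially below
        high = self.enc2(low)
        return self.seg_head(high), self.edge_head(high), low

class FeatureDiscriminator(nn.Module):
    """Patch discriminator that predicts source vs. target from low-level features."""
    def __init__(self, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4, stride=2, padding=1))

    def forward(self, f):
        return self.net(f)

model, disc = DualTaskSegNet(), FeatureDiscriminator()
ce, bce = nn.CrossEntropyLoss(), nn.BCEWithLogitsLoss()

def generator_step(src_img, src_mask, src_edge, tgt_img, tgt_pseudo_edge):
    """One segmentation-network update: supervised dual-task loss on source,
    edge self-supervision on target, adversarial loss on target low-level features.
    (Loss weights 0.1 / 0.01 are placeholders, not values from the paper.)"""
    seg_s, edge_s, low_s = model(src_img)
    seg_t, edge_t, low_t = model(tgt_img)
    loss_sup = ce(seg_s, src_mask) + bce(edge_s, src_edge)          # source supervision
    loss_self = bce(edge_t, tgt_pseudo_edge)                        # target self-supervision
    d_out = disc(low_t)
    loss_adv = bce(d_out, torch.ones_like(d_out))                   # fool the discriminator
    return loss_sup + 0.1 * loss_self + 0.01 * loss_adv
```

In this sketch the discriminator would be trained separately to distinguish source from target low-level features, while the segmentation network is updated with the combined loss above; the paper's hierarchical variant applies such alignment at multiple low-level stages.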

Cite

APA

Xue, Y., Feng, S., Zhang, Y., Zhang, X., & Wang, Y. (2020). Dual-task self-supervision for cross-modality domain adaptation. In Lecture Notes in Computer Science (Vol. 12261 LNCS, pp. 408–417). Springer. https://doi.org/10.1007/978-3-030-59710-8_40
