Transformer Lesion Tracker

Abstract

Evaluating lesion progression and treatment response through longitudinal lesion tracking plays a critical role in clinical practice. Automated approaches to this task are motivated by the prohibitive labor cost and time required when lesion matching is performed manually. Previous methods typically lack the integration of local and global information. In this work, we propose a transformer-based approach, termed Transformer Lesion Tracker (TLT). Specifically, we design a Cross Attention-based Transformer (CAT) to capture and combine both global and local information to enhance feature extraction. We also develop a Registration-based Anatomical Attention Module (RAAM) that introduces anatomical information into CAT so that it can focus on useful features. A Sparse Selection Strategy (SSS) is presented for selecting features and reducing the memory footprint of Transformer training. In addition, we use a global regression to further improve model performance. Experiments on a public dataset demonstrate the superiority of our method: our model reduces the average Euclidean center error by at least 14.3% (6 mm vs. 7 mm) compared with the state of the art (SOTA). Code is available at https://github.com/TangWen920812/TLT.
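The central mechanism named in the abstract, cross attention that fuses local (lesion-centered) features with global (whole-image) context, can be sketched in a few lines of PyTorch. The module below is a minimal illustration under assumed token shapes and names; it is not the authors' CAT implementation, which lives in the linked repository:

```python
import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    """Minimal cross-attention sketch: local tokens query global tokens.

    All dimensions and names here are illustrative assumptions, not the
    paper's actual architecture.
    """
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, local_tokens: torch.Tensor,
                global_tokens: torch.Tensor) -> torch.Tensor:
        # Queries come from the local (lesion-centered) features;
        # keys and values come from the global (whole-image) features,
        # so each local token aggregates context from the entire scan.
        fused, _ = self.attn(query=local_tokens,
                             key=global_tokens,
                             value=global_tokens)
        # Residual connection plus normalization, the usual transformer idiom.
        return self.norm(local_tokens + fused)

# Toy usage: batch of 2 cases, 64 local tokens, 256 global tokens, dim 128.
local_feats = torch.randn(2, 64, 128)
global_feats = torch.randn(2, 256, 128)
out = CrossAttentionBlock(dim=128)(local_feats, global_feats)
print(out.shape)  # torch.Size([2, 64, 128])
```

A sparse selection step such as the paper's SSS would, in this framing, prune the global token set before attention to cut the quadratic attention cost; the exact selection criterion is part of the paper's method rather than this sketch.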

Citation (APA)

Tang, W., Kang, H., Zhang, H., Yu, P., Arnold, C. W., & Zhang, R. (2022). Transformer Lesion Tracker. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13436 LNCS, pp. 196–206). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-16446-0_19
