Mammography is a widely used screening tool for breast cancer, and accurate diagnosis is critical for effective management of the disease. In this study, we propose a novel cross-view mutual learning method that leverages a Cross-view Masked Autoencoder (CMAE) and a Dual-View Affinity Matrix (DAM) to extract cross-view features and facilitate malignancy classification in mammography. CMAE extracts the underlying features from multi-view mammography data without relying on lesion annotations or multi-view registration. DAM helps overcome the limitations of single-view models by identifying the distinctive patterns and features of each view, thereby improving the accuracy and robustness of breast tissue representations. We evaluate our approach on a large-scale in-house mammography dataset and demonstrate promising results compared with existing methods. Additionally, we perform an ablation analysis to investigate the influence of different loss functions on the performance of our method. The results show that all the proposed components contribute positively to the final performance. In summary, the proposed cross-view mutual learning method shows great potential for assisting malignancy classification.
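The abstract does not give the exact formulation of the Dual-View Affinity Matrix, but a common way to relate embeddings from the two standard mammographic views (craniocaudal, CC, and mediolateral-oblique, MLO) is a cosine-similarity affinity matrix. The sketch below is a hypothetical illustration of that idea, not the paper's implementation; the function name `dual_view_affinity` and the feature dimensions are assumptions for illustration only.

```python
import numpy as np

def dual_view_affinity(feat_cc, feat_mlo, eps=1e-8):
    """Hypothetical sketch of a cross-view affinity matrix.

    feat_cc:  (N, D) embeddings from the CC view
    feat_mlo: (N, D) embeddings from the MLO view
    Returns an (N, N) matrix A where A[i, j] is the cosine
    similarity between sample i's CC features and sample j's
    MLO features. High diagonal values indicate that the two
    views of the same breast agree in feature space.
    """
    # L2-normalize each row so the dot product equals cosine similarity.
    cc = feat_cc / (np.linalg.norm(feat_cc, axis=1, keepdims=True) + eps)
    mlo = feat_mlo / (np.linalg.norm(feat_mlo, axis=1, keepdims=True) + eps)
    return cc @ mlo.T

# Toy example with random features standing in for encoder outputs.
rng = np.random.default_rng(0)
f_cc = rng.standard_normal((4, 16))
f_mlo = rng.standard_normal((4, 16))
A = dual_view_affinity(f_cc, f_mlo)
print(A.shape)  # (4, 4)
```

A mutual-learning objective could then encourage large diagonal entries of `A` (the paired views of the same case) relative to its off-diagonal entries, analogous to a contrastive alignment loss.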
CITATION STYLE
Wu, Q., Tan, H., Qiao, Z., Dong, P., Shen, D., Wang, M., & Xue, Z. (2024). Cross-view Contrastive Mutual Learning Across Masked Autoencoders for Mammography Diagnosis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 14349 LNCS, pp. 74–83). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-45676-3_8