Self-transfer learning for weakly supervised lesion localization

Abstract

Recent advances in deep learning have achieved remarkable performance in various computer vision tasks, including weakly supervised object localization. Weakly supervised object localization is practically useful since it does not require fine-grained annotations. Current approaches overcome the difficulties of weak supervision via transfer learning from models pre-trained on large-scale collections of general images such as ImageNet. However, they cannot be applied in the medical image domain, where no such priors exist. In this work, we present a novel weakly supervised learning framework for lesion localization called self-transfer learning (STL). STL jointly optimizes both classification and localization networks, helping the localization network focus on correct lesions without any type of prior. We evaluate the STL framework on chest X-rays and mammograms, and achieve significantly better localization performance than previous weakly supervised localization approaches.
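The joint optimization described above can be illustrated as a weighted combination of the two branch losses. The following is a minimal sketch, not the authors' implementation: the function names, the use of cross-entropy for both branches, and the particular `alpha` value are all illustrative assumptions.

```python
import numpy as np

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class (probs is a
    probability vector over classes)."""
    return -np.log(probs[label] + 1e-12)

def joint_stl_loss(cls_probs, loc_probs, label, alpha=0.1):
    """Hypothetical joint loss for a shared-feature network with a
    classification branch and a localization branch.

    `alpha` trades off the two branches; one plausible schedule is to
    shift weight toward the localization branch as training
    progresses (an assumption for illustration, not a detail taken
    from the abstract).
    """
    l_cls = cross_entropy(cls_probs, label)   # classification branch
    l_loc = cross_entropy(loc_probs, label)   # localization branch
    return (1 - alpha) * l_cls + alpha * l_loc
```

Because both branches share the same label and backbone features, minimizing the weighted sum lets the classification signal guide the localization branch even though no lesion-level annotations are available.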

Citation (APA)

Hwang, S., & Kim, H. E. (2016). Self-transfer learning for weakly supervised lesion localization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9901 LNCS, pp. 239–246). Springer Verlag. https://doi.org/10.1007/978-3-319-46723-8_28
