We propose a novel approach for class-agnostic object proposal generation that is efficient and especially well-suited to detecting small objects. Efficiency is achieved through scale-specific objectness attention maps, which focus processing on promising parts of the image and strongly reduce the number of sampled windows. The resulting system is 33% faster than the state of the art while clearly outperforming it in terms of average recall. Secondly, we add a module for detecting small objects, which are often missed by recent models, and show that this module improves the average recall for small objects by about 53%. Our implementation is available at: https://www.inf.uni-hamburg.de/en/inst/ab/cv/people/wilms/attentionmask.
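The core efficiency idea described above, restricting window sampling to high-objectness regions of a per-scale attention map, can be sketched as follows. This is a minimal illustrative sketch only: the function name, threshold, and hand-set attention map are assumptions for demonstration, whereas in AttentionMask the attention maps are learned by CNN branches, one per scale.

```python
import numpy as np

def sample_windows(attention, threshold=0.5, window=16, stride=8):
    """Keep only windows whose center lies on a high-attention location.

    `attention` is a 2D objectness attention map for one scale
    (hypothetical stand-in for a learned attention branch).
    """
    h, w = attention.shape
    kept = []
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            cy, cx = y + window // 2, x + window // 2
            if attention[cy, cx] >= threshold:
                kept.append((y, x, window, window))
    return kept

# Toy attention map: high objectness only in the top-left corner,
# so only a few of the 49 candidate windows survive.
att = np.zeros((64, 64))
att[:24, :24] = 1.0
windows = sample_windows(att, threshold=0.5)
```

In this toy setup only the windows centered inside the high-attention corner are kept, which mirrors how pruning low-objectness regions cuts the number of windows the later stages must process.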
Wilms, C., & Frintrop, S. (2019). AttentionMask: Attentive, Efficient Object Proposal Generation Focusing on Small Objects. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11362 LNCS, pp. 678–694). Springer Verlag. https://doi.org/10.1007/978-3-030-20890-5_43