RGB-Infrared (IR) cross-modality person re-identification (re-ID), which aims to match an IR probe image against an RGB gallery or vice versa, is a challenging task due to the large discrepancy between the IR and RGB modalities. Existing methods typically address this challenge by aligning feature distributions or image styles across modalities, whereas the highly useful similarities among gallery samples of the same modality (i.e., intra-modality sample similarities) are largely neglected. This paper presents a novel similarity inference metric (SIM) that exploits intra-modality sample similarities to circumvent the cross-modality discrepancy and achieve optimal cross-modality image matching. SIM works by successive similarity graph reasoning and mutual nearest-neighbor reasoning, which mine cross-modality sample similarities by leveraging intra-modality sample similarities from two different perspectives. Extensive experiments over two cross-modality re-ID datasets (SYSU-MM01 and RegDB) show that SIM achieves significant accuracy improvements with little extra training compared with the state of the art.
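The abstract describes SIM as combining similarity graph reasoning with mutual nearest-neighbor reasoning over intra-modality similarities at matching time. The numpy sketch below only illustrates the general flavour of such inference-time re-ranking: it refines a cross-modality query-gallery distance matrix with an intra-gallery similarity graph and a mutual k-nearest-neighbor check. The function names (`graph_refined_distance`, `mutual_nn_mask`), the one-step smoothing rule, and all parameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np


def mutual_nn_mask(dist_q2g, k=10):
    """Boolean mask of (query, gallery) pairs that are mutual k-nearest
    neighbours under the cross-modality distances dist_q2g, shaped
    (num_query, num_gallery)."""
    num_q, num_g = dist_q2g.shape
    # k nearest gallery samples for each query
    q_nn = np.argsort(dist_q2g, axis=1)[:, :k]
    # k nearest query samples for each gallery sample
    g_nn = np.argsort(dist_q2g.T, axis=1)[:, :k]

    q_mask = np.zeros((num_q, num_g), dtype=bool)
    q_mask[np.arange(num_q)[:, None], q_nn] = True
    g_mask = np.zeros((num_g, num_q), dtype=bool)
    g_mask[np.arange(num_g)[:, None], g_nn] = True
    return q_mask & g_mask.T


def graph_refined_distance(dist_q2g, dist_g2g, k=10, alpha=0.5):
    """Refine cross-modality distances by one smoothing step over an
    intra-gallery similarity graph (illustrative rule, not the paper's)."""
    num_g = dist_g2g.shape[0]
    # sparse affinity graph over each gallery sample's k intra-modality neighbours
    nn = np.argsort(dist_g2g, axis=1)[:, :k]
    rows = np.arange(num_g)[:, None]
    W = np.zeros_like(dist_g2g)
    W[rows, nn] = np.exp(-dist_g2g[rows, nn])
    W /= W.sum(axis=1, keepdims=True)
    # blend each query-gallery distance with the distances to that gallery
    # sample's intra-modality neighbours
    return alpha * dist_q2g + (1.0 - alpha) * dist_q2g @ W.T


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    q_feat = rng.normal(size=(4, 8))   # toy IR query features
    g_feat = rng.normal(size=(6, 8))   # toy RGB gallery features
    d_qg = np.linalg.norm(q_feat[:, None] - g_feat[None], axis=2)
    d_gg = np.linalg.norm(g_feat[:, None] - g_feat[None], axis=2)

    refined = graph_refined_distance(d_qg, d_gg, k=3)
    mutual = mutual_nn_mask(d_qg, k=3)
    # small bonus for mutual nearest neighbours before ranking the gallery
    ranking = np.argsort(refined - 0.1 * mutual, axis=1)
    print(ranking)
```

In practice the toy features would be replaced by the trained network's RGB and IR embeddings; the point of the sketch is that the reasoning happens entirely at inference time, consistent with the abstract's claim of little extra training.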
Jia, M., Zhai, Y., Lu, S., Ma, S., & Zhang, J. (2020). A similarity inference metric for RGB-infrared cross-modality person re-identification. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 1026–1032). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/143