Recent deep Re-ID models focus mainly on learning high-level semantic features but fail to explicitly exploit color information, one of the most important cues for person Re-ID. In this paper, we propose a novel Color-Sensitive Re-ID model that takes full advantage of color information. On the one hand, we train the model with both real and fake images; the extra fake images expose it to more color variation and help avoid over-fitting during training. On the other hand, we also train the model with images of the same person wearing clothes of different colors, which forces the features to attend to color differences in local regions. To generate fake images with specified clothing colors, we propose a novel Color Translation GAN (CTGAN) that learns mappings between different clothing colors while preserving the identity of the person across color changes. Extensive evaluations on two benchmark datasets show that our approach significantly outperforms state-of-the-art Re-ID models.
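The abstract only outlines the training scheme. The sketch below is a minimal PyTorch illustration of how real images and CTGAN-recolored fake images of the same identity could be combined in one training step; the ResNet-50 backbone, the loss terms, and the margin value are assumptions made for illustration, not the authors' actual implementation.

```python
# Minimal sketch of training a Re-ID backbone on real images plus
# CTGAN-recolored "fake" images of the same identity.
# All module and loss choices here are illustrative assumptions.
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class ReIDNet(nn.Module):
    """ResNet-50 backbone with an identity classifier (assumed architecture)."""

    def __init__(self, num_ids: int, feat_dim: int = 2048):
        super().__init__()
        backbone = models.resnet50(weights=None)
        # Drop the final fc layer; keep conv stages + global average pooling.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        self.classifier = nn.Linear(feat_dim, num_ids)

    def forward(self, x):
        f = self.features(x).flatten(1)   # (B, 2048) embedding
        logits = self.classifier(f)       # identity logits
        return f, logits


def color_sensitive_step(model, real, fake_same_id, labels, margin: float = 0.3):
    """One training step combining real and color-translated images.

    `fake_same_id` contains the same persons as `real` but with clothing
    colors translated offline by a color-translation GAN. The losses below
    are one plausible instantiation of the idea, not the paper's exact ones.
    """
    f_real, logits_real = model(real)
    f_fake, logits_fake = model(fake_same_id)

    # Identity loss on both real and fake images: the extra fake images act
    # as color augmentation and help reduce over-fitting.
    id_loss = F.cross_entropy(logits_real, labels) + F.cross_entropy(logits_fake, labels)

    # Color-sensitivity term: separate embeddings of the same person under
    # different clothing colors, so features must encode color differences.
    dist = F.pairwise_distance(f_real, f_fake)
    color_loss = F.relu(margin - dist).mean()

    return id_loss + color_loss
```

In this sketch the fake images are assumed to be pre-generated; how the color-sensitivity term is balanced against the identity loss would depend on the dataset and is not specified in the abstract.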
CITATION STYLE
Wang, G., Yang, Y. Y., Cheng, J., Wang, J., & Hou, Z. (2019). Color-sensitive person re-identification. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 933–939). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/131