Semantic Map Guided Identity Transfer GAN for Person Re-identification

Abstract

Generative adversarial network (GAN)-based person re-identification (re-id) schemes offer a promising way to augment training data in practical applications. However, existing solutions perform poorly because data generation is separated from re-id training and real-world scenarios lack diverse data. In this paper, we propose IDGAN, a person re-id model built on a semantic-map-guided identity-transfer GAN, to improve re-id performance. With the aid of the semantic map, IDGAN efficiently and accurately generates pedestrian images with varying poses, viewpoints, and backgrounds, improving the diversity of the training data. To increase visual realism, IDGAN applies a gradient augmentation method based on local quality attention to refine the generated images locally. A two-stage joint training framework then allows the GAN and the re-id network to learn from each other, making better use of the generated data. Detailed experimental results demonstrate that, compared with existing state-of-the-art methods, IDGAN produces high-quality images and significantly improves re-id performance: the FID of images generated on Market-1501 is reduced by 1.15, and mAP on Market-1501 and DukeMTMC-reID increases by 3.3% and 2.6%, respectively.
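The abstract describes a two-stage joint training framework in which the GAN is first trained on its own, after which the GAN and the re-id network are updated alternately so that each benefits from the other. The control flow of such a schedule can be sketched as follows; this is a minimal illustration only, and every name (`train_gan_step`, `train_reid_step`, `generate_batch`) is a hypothetical placeholder rather than the paper's actual implementation.

```python
# Hypothetical sketch of a two-stage joint training schedule, as outlined in
# the abstract. Stage 1 pretrains the GAN alone; stage 2 alternates updates so
# the re-id network learns from generated data while the GAN is refined in
# turn. The callables passed in stand for the real training steps.

def joint_training(gan_epochs, joint_epochs,
                   train_gan_step, train_reid_step, generate_batch):
    """Run the two-stage schedule; returns a log of which phase ran."""
    log = []
    # Stage 1: train the generator/discriminator on real data only.
    for _ in range(gan_epochs):
        train_gan_step(real_only=True)
        log.append("gan")
    # Stage 2: alternate updates between the two networks.
    for _ in range(joint_epochs):
        fake = generate_batch()          # augmented pedestrian images
        train_reid_step(fake)            # re-id trains on generated data
        train_gan_step(real_only=False)  # GAN refined alongside the re-id net
        log.append("joint")
    return log
```

With stub step functions, `joint_training(2, 3, ...)` would run two GAN-only epochs followed by three alternating epochs, matching the two-stage description.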

Cite

Wu, T., Zhu, R., & Wan, S. (2024). Semantic Map Guided Identity Transfer GAN for Person Re-identification. ACM Transactions on Multimedia Computing, Communications and Applications, 20(11). https://doi.org/10.1145/3631355
