Human-specified appearance features, such as color and texture histograms, are widely used for person re-identification at present, but they are limited by the subjective choice of cues describing pedestrian appearance. This paper presents a new representation for re-identification that incorporates data-driven features to improve the reliability and robustness of person matching. First, we use a deep learning network, the PCA Network, to learn data-driven features from person images. These features mine more discriminative cues from the pedestrian data and compensate for the drawbacks of human-specified features. The data-driven features are then combined with common human-specified features to produce a final representation of each image. The resulting enriched Data-driven Representation (eDR) has been validated through experiments on two person re-identification datasets, demonstrating that the proposed representation is effective for person matching: the data-driven features enable more accurate re-identification when fused with the human-specified features.
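The pipeline described above can be sketched in a minimal, illustrative form: a human-specified color histogram, a data-driven projection learned by PCA (standing in for the full PCA Network, which stacks such PCA filter banks), and simple concatenation as the fusion step. All function names, dimensions, and the fusion-by-concatenation choice here are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def color_histogram(image, bins=8):
    # Human-specified feature: per-channel color histogram,
    # L1-normalized over all channels. image: H x W x 3, uint8.
    feats = []
    for c in range(3):
        h, _ = np.histogram(image[..., c], bins=bins, range=(0, 256))
        feats.append(h)
    hist = np.concatenate(feats).astype(float)
    return hist / hist.sum()

def pca_features(patches, n_components=4):
    # Data-driven feature: project mean-centered flattened patches
    # onto the top principal components learned from the data itself.
    # patches: N x D matrix of flattened image patches.
    mean = patches.mean(axis=0)
    X = patches - mean
    # Rows of Vt are the principal directions (learned "filters").
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T  # N x n_components projections

def fuse(hist_feat, pca_feat):
    # Enriched representation: concatenate the two feature views.
    return np.concatenate([hist_feat, pca_feat])

# Toy usage on random data (hypothetical sizes).
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 16, 3), dtype=np.uint8)
hist = color_histogram(img)                  # 24-dim human-specified part
patches = rng.standard_normal((50, 49))      # 50 flattened 7x7 patches
proj = pca_features(patches)                 # 50 x 4 data-driven codes
desc = fuse(hist, proj.mean(axis=0))         # 28-dim fused descriptor
```

In a real system the fused descriptors for two pedestrian images would then be compared with a learned or fixed distance metric to decide whether they depict the same person.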
Li, X., Gao, J., Chang, X., Mai, Y., & Zheng, W. S. (2014). Person re-identification with data-driven features. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 8833, 506–513. https://doi.org/10.1007/978-3-319-12484-1_58