Pose-Based Temporal-Spatial Network (PTSN) for Gait Recognition with Carrying and Clothing Variations

150 citations · 37 readers on Mendeley

Abstract

Gait recognition is one of the most attractive biometric techniques because of its potential for human identification at a distance. However, it remains challenging in real applications due to the many variations that affect appearance and shape. Appearance-based methods usually compute the gait energy image (GEI), which is extracted from human silhouettes; because a GEI is obtained by averaging the silhouettes, the temporal information is removed. Body joints, in contrast, are invariant to changes in clothing and carrying conditions. We propose a novel pose-based gait recognition approach that is more robust to clothing and carrying variations. In addition, a pose-based temporal-spatial network (PTSN) is proposed to extract temporal-spatial features, which effectively improves gait recognition performance. Experiments on the challenging CASIA-B dataset show that our method achieves state-of-the-art performance under both carrying and clothing conditions.
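To illustrate the GEI limitation the abstract mentions, here is a minimal NumPy sketch (not from the paper) of how a gait energy image is typically computed: per-pixel averaging of aligned binary silhouettes, which keeps how often each pixel is active but discards the frame ordering.

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Average a sequence of aligned binary silhouettes into a GEI.

    silhouettes: array-like of shape (T, H, W) with values in {0, 1}.
    Returns an (H, W) image in [0, 1]; temporal ordering is lost.
    """
    frames = np.asarray(silhouettes, dtype=np.float64)
    return frames.mean(axis=0)

# Toy example: two 2x2 "silhouettes". The average records pixel
# activation frequency, not which frame a pixel was active in.
seq = [np.array([[1, 0], [1, 1]]),
       np.array([[1, 1], [0, 1]])]
gei = gait_energy_image(seq)
# gei == [[1.0, 0.5], [0.5, 1.0]]
```

Note that reversing the frame order yields the identical GEI, which is exactly why the temporal cues exploited by PTSN are unavailable to purely GEI-based methods.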

Citation (APA)

Liao, R., Cao, C., Garcia, E. B., Yu, S., & Huang, Y. (2017). Pose-Based Temporal-Spatial Network (PTSN) for Gait Recognition with Carrying and Clothing Variations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10568 LNCS, pp. 474–483). Springer Verlag. https://doi.org/10.1007/978-3-319-69923-3_51
