Statistical analysis of humans, their motion and their behaviour is a well-studied problem. With the availability of accurate motion capture systems, it has become possible to use such analysis for animation, understanding, compression and tracking of human motion. At the core of the analysis lies a measure of the distance between two human poses; practically always, this measure is the Euclidean distance between joint angle vectors. Recent work [7] has shown that articulated tracking systems can be vastly improved by replacing the Euclidean distance in joint angle space with the geodesic distance in the space of joint positions. However, due to the focus on tracking, no algorithms have so far been presented for measuring these distances between human poses. In this paper, we present an algorithm for computing geodesics in the Riemannian space of joint positions, as well as a fast approximation that allows for large-scale analysis. In our experiments we show that this measure significantly outperforms the traditional measure in classification, clustering and dimensionality reduction tasks. © 2012 Springer-Verlag.
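The contrast between the two measures can be illustrated with a minimal sketch. The example below is not the paper's algorithm: it uses a hypothetical planar kinematic chain and a chordal (straight-line) distance between stacked joint positions as a crude stand-in for the Riemannian geodesic, purely to show why angle-space and position-space distances disagree. Perturbing a proximal joint moves every downstream joint, while an equally large perturbation of a distal joint moves only the chain's tip, yet the joint-angle distance cannot tell the two apart.

```python
import numpy as np

def joint_angle_distance(theta_a, theta_b):
    """Traditional measure: Euclidean distance between joint angle vectors."""
    return np.linalg.norm(np.asarray(theta_a) - np.asarray(theta_b))

def forward_kinematics(thetas, link_lengths):
    """Map joint angles of a planar kinematic chain to stacked 2-D joint positions.
    (Hypothetical toy skeleton; real human models are 3-D with branching limbs.)"""
    positions = [np.zeros(2)]
    angle = 0.0
    for theta, length in zip(thetas, link_lengths):
        angle += theta  # angles are relative to the parent link
        positions.append(positions[-1] +
                         length * np.array([np.cos(angle), np.sin(angle)]))
    return np.concatenate(positions)

def spatial_distance(theta_a, theta_b, link_lengths):
    """Chordal approximation of a spatial measure: Euclidean distance between
    stacked joint positions (a stand-in for the true geodesic distance)."""
    return np.linalg.norm(forward_kinematics(theta_a, link_lengths)
                          - forward_kinematics(theta_b, link_lengths))

# Two perturbations of the same rest pose, equal in joint angle space:
rest, proximal, distal = [0.0, 0.0], [0.5, 0.0], [0.0, 0.5]
links = [1.0, 1.0]
print(joint_angle_distance(rest, proximal),
      joint_angle_distance(rest, distal))          # identical
print(spatial_distance(rest, proximal, links),
      spatial_distance(rest, distal, links))        # proximal change is larger
```

The angle-space measure assigns both perturbations the same distance (0.5 rad), while the position-space measure correctly reports that rotating the proximal joint displaces the body far more, which is the intuition behind replacing the traditional measure.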
Hauberg, S., & Steenstrup Pedersen, K. (2012). Spatial measures between human poses for classification and understanding. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7378 LNCS, pp. 26–36). https://doi.org/10.1007/978-3-642-31567-1_3