Robust 3D arm tracking from monocular videos

Abstract

In this paper, we present a robust method to tackle the ambiguities in 3D arm tracking, especially those introduced by depth change (the distance of the arm from the camera) and by arm rotation about the humerus (the upper arm bone). Within a particle filter framework, arm joint angle configurations are monitored and occurrences of ambiguous arm movements are detected. Inverse kinematics is then applied to transfer invalid joint angle configurations from the unconstrained movement space into the constrained space. Experimental results demonstrate the efficacy of the proposed approach.
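The sketch below is a minimal, illustrative take on the pipeline the abstract describes: propagate joint-angle particles, detect configurations that leave the anatomically valid space, and correct them before reweighting against the observation. It is not the authors' implementation; the 2-DOF planar arm, the joint limits, the Gaussian wrist likelihood, and all function names are assumptions made for illustration only.

    # Minimal sketch of particle-filter arm tracking with a constrained-space
    # correction step. All model details here are illustrative assumptions,
    # not taken from the paper (which tracks a full 3D arm from monocular video).
    import numpy as np

    # Assumed 2-DOF planar arm: shoulder and elbow angles, link lengths L1, L2.
    L1, L2 = 0.30, 0.25                              # assumed limb lengths (m)
    JOINT_MIN = np.array([-0.5 * np.pi, 0.0])        # assumed joint limits (rad)
    JOINT_MAX = np.array([0.5 * np.pi, 0.9 * np.pi])

    def forward_kinematics(theta):
        # Wrist position of the 2-link arm for joint angles theta = (shoulder, elbow).
        s, e = theta
        return np.array([L1 * np.cos(s) + L2 * np.cos(s + e),
                         L1 * np.sin(s) + L2 * np.sin(s + e)])

    def project_to_constrained_space(theta):
        # Stand-in for the paper's IK-based correction: clamp an invalid joint
        # configuration to the nearest point in the valid joint-angle box.
        return np.clip(theta, JOINT_MIN, JOINT_MAX)

    def likelihood(theta, observed_wrist, sigma=0.1):
        # Observation-likelihood surrogate: Gaussian error between predicted and
        # observed wrist position (a real tracker would score image features).
        err = forward_kinematics(theta) - observed_wrist
        return np.exp(-0.5 * np.dot(err, err) / sigma**2)

    def particle_filter_step(particles, observed_wrist, noise=0.05):
        # Predict: random-walk dynamics in joint space.
        particles = particles + np.random.normal(0.0, noise, particles.shape)
        # Detect and correct configurations outside the constrained space.
        particles = np.apply_along_axis(project_to_constrained_space, 1, particles)
        # Correct: reweight by observation likelihood, then resample.
        weights = np.array([likelihood(p, observed_wrist) for p in particles]) + 1e-12
        weights /= weights.sum()
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        return particles[idx]

    # Usage: track a synthetic wrist observation for a few frames.
    particles = np.random.uniform(JOINT_MIN, JOINT_MAX, size=(200, 2))
    true_theta = np.array([0.3, 0.8])
    for _ in range(10):
        particles = particle_filter_step(particles, forward_kinematics(true_theta))
    estimate = particles.mean(axis=0)   # posterior-mean joint-angle estimate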

Citation (APA)

Guo, F., & Qian, G. (2005). Robust 3D arm tracking from monocular videos. In Lecture Notes in Computer Science (Vol. 3645, pp. 841–850). Springer Verlag. https://doi.org/10.1007/11538356_87
