We present a system capable of visually detecting pointing gestures and estimating the 3D pointing direction in real time. We use Hidden Markov Models (HMMs) trained on different phases of sample pointing gestures to detect the occurrence of a gesture. For estimating the pointing direction, we compare two approaches: (1) the line of sight between head and hand, and (2) the forearm orientation. Input features for the HMMs are the 3D trajectories of the person's head and hands, which are extracted from image sequences provided by a stereo camera. In a person-independent test scenario, our system achieved a gesture detection rate of 88%. For 90% of the detected gestures, the correct pointing target (one out of eight objects) was identified. © Springer-Verlag Berlin Heidelberg 2003.
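The head-hand line-of-sight approach described in the abstract can be illustrated with a small sketch: cast a ray from the head through the hand and pick the candidate object closest to that ray. The function name, the use of NumPy, and the perpendicular-distance criterion are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def pointing_target(head, hand, targets):
    """Pick the target closest to the head-hand line of sight.

    head, hand: 3D positions; targets: sequence of 3D points.
    The pointing ray starts at the head and passes through the hand.
    Returns the index of the best target, or None if no target lies
    in front of the person.
    """
    head = np.asarray(head, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - head
    direction /= np.linalg.norm(direction)  # unit pointing direction

    best, best_dist = None, np.inf
    for i, t in enumerate(targets):
        v = np.asarray(t, dtype=float) - head
        proj = np.dot(v, direction)         # distance along the ray
        # Perpendicular distance from the target to the pointing ray
        dist = np.linalg.norm(v - proj * direction)
        if proj > 0 and dist < best_dist:   # target must lie in front
            best, best_dist = i, dist
    return best
```

In the paper's setup, the same selection step would be applied to eight candidate objects; the forearm-orientation variant differs only in how the ray's direction is obtained.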
CITATION STYLE
Nickel, K., & Stiefelhagen, R. (2003). Real-time recognition of 3D-pointing gestures for human-machine-interaction. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2781, 557–565. https://doi.org/10.1007/978-3-540-45243-0_71