Real-time recognition of 3D-pointing gestures for human-machine-interaction

Abstract

We present a system capable of visually detecting pointing gestures and estimating the 3D pointing direction in real-time. We use Hidden Markov Models (HMMs) trained on different phases of sample pointing gestures to detect the occurrence of a gesture. For estimating the pointing direction, we compare two approaches: (1) the line of sight between head and hand, and (2) the forearm orientation. Input features for the HMMs are the 3D trajectories of the person's head and hands, extracted from image sequences provided by a stereo camera. In a person-independent test scenario, our system achieved a gesture detection rate of 88%. For 90% of the detected gestures, the correct pointing target (one out of eight objects) was identified. © Springer-Verlag Berlin Heidelberg 2003.
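The head-hand line-of-sight approach described in the abstract can be illustrated with a small sketch: cast a ray from the head through the hand and pick the candidate object with the smallest perpendicular distance to that ray. This is a hypothetical illustration, not the authors' implementation; the function name, the distance criterion, and the example coordinates are assumptions for demonstration only.

```python
import numpy as np

def pointing_target(head, hand, targets):
    """Pick the target closest to the head-hand pointing ray.

    head, hand: 3D positions as arrays of shape (3,).
    targets: list of 3D target positions.
    Returns the index of the target whose perpendicular distance
    to the ray (from head, through hand) is smallest, or None if
    every target lies behind the pointer.
    """
    direction = hand - head
    direction = direction / np.linalg.norm(direction)

    best, best_dist = None, np.inf
    for i, t in enumerate(targets):
        v = t - head
        along = np.dot(v, direction)      # projection onto the ray
        if along <= 0:                    # target behind the pointer
            continue
        dist = np.linalg.norm(v - along * direction)
        if dist < best_dist:
            best, best_dist = i, dist
    return best
```

The forearm-orientation variant compared in the paper would use the same ray-distance test with a different origin and direction (elbow through hand), so only the two input points change.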

Citation (APA)

Nickel, K., & Stiefelhagen, R. (2003). Real-time recognition of 3D-pointing gestures for human-machine-interaction. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2781, 557–565. https://doi.org/10.1007/978-3-540-45243-0_71
