Detecting, tracking, and interpretation of a pointing gesture by an overhead view camera

Abstract

In this work we describe a set of visual routines that support a novel sensor-free interface between a human and virtual objects. The visual routines detect, track, and interpret a pointing gesture in real time. The problem is solved in the context of a scenario in which a user activates virtual objects displayed on a projection screen. By changing the direction of pointing with an arm extended towards the screen, the user controls the motion of the virtual objects. The vision system consists of a single overhead-view camera and exploits a priori knowledge of human body appearance, the interactive context, and the environment. The system operates in real time on a standard Pentium PC platform.
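The paper itself does not include code. The following is a minimal sketch, under assumptions not taken from the paper, of how a pointing direction could be estimated from an overhead view: given a background-subtracted binary silhouette of the user, treat the contour point farthest from the body centroid as the tip of the extended hand, and take the centroid-to-hand vector as the pointing direction in the image plane. Function and parameter names are illustrative only.

```python
import cv2
import numpy as np

def estimate_pointing_direction(silhouette):
    """Estimate a 2D pointing direction from a binary overhead-view silhouette.

    Assumption (not from the paper): with the arm extended towards the screen,
    the hand is the silhouette point farthest from the body centroid, so the
    centroid-to-hand vector approximates the pointing direction as seen by
    the overhead camera.
    """
    contours, _ = cv2.findContours(silhouette, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    body = max(contours, key=cv2.contourArea)          # largest blob = the user
    m = cv2.moments(body)
    if m["m00"] == 0:
        return None
    centroid = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

    pts = body.reshape(-1, 2).astype(np.float64)
    dists = np.linalg.norm(pts - centroid, axis=1)
    hand = pts[np.argmax(dists)]                       # farthest point = hand tip

    direction = hand - centroid
    norm = np.linalg.norm(direction)
    if norm < 1e-6:
        return None
    return centroid, direction / norm                  # ray origin + unit direction
```

In such a setup, the returned ray could then be intersected with the known image-plane position of the projection screen to decide which virtual object the user is pointing at; the paper's actual routines additionally use knowledge of body appearance and the interaction context, which this sketch omits.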

Citation (APA)

Kolesnik, M., & Kuleßa, T. (2001). Detecting, tracking, and interpretation of a pointing gesture by an overhead view camera. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2191, pp. 429–436). Springer Verlag. https://doi.org/10.1007/3-540-45404-7_57
