POINTING GESTURE VISUAL RECOGNITION BY BODY FEATURE DETECTION AND TRACKING

  • Carbini, S.
  • Viallet, J. E.
  • Bernier, O.

Abstract

Among the gestures users naturally perform during communication, pointing gestures can be easily recognized and incorporated into more natural Human Computer Interfaces. We approximate the eye-finger pointing direction of a user by detecting and tracking, in real time, the 3D positions of the centre of the face and of both hands; these positions are obtained with a stereoscopic device located on top of the display. From the head position and biometric constraints, we define both a rest area and an action area. In the latter, the hands are searched for and the pointing intention is detected. The first hand the user spontaneously moves forward is defined as the pointing hand, whereas the second detected hand, when it first moves forward, is considered the selection hand. Experiments on spatial precision, carried out with a group of users, show that the minimum size of an object to be easily pointed at is about 1.5 percent of the diagonal of the large display.
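The core geometric step described in the abstract, extending the eye-finger line until it meets the display, can be sketched as a simple ray-plane intersection. The function name, the coordinate frame, and the display-as-plane model below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pointing_target(eye, finger, plane_point, plane_normal):
    """Intersect the eye->finger ray with the display plane.

    eye, finger: 3D points (e.g. metres) from a stereo tracker.
    plane_point, plane_normal: any point on the display plane and its normal.
    Returns the 3D intersection point, or None if the ray is (near) parallel
    to the plane or points away from it.
    """
    eye = np.asarray(eye, dtype=float)
    # Eye-finger direction approximates the user's pointing direction.
    direction = np.asarray(finger, dtype=float) - eye
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the display plane
    t = float(np.dot(plane_normal, np.asarray(plane_point, dtype=float) - eye)) / denom
    if t <= 0:
        return None  # intersection behind the user
    return eye + t * direction

# Hypothetical setup: display modelled as the plane z = 0, user in front
# at z > 0, coordinates in metres relative to the display centre.
target = pointing_target(eye=[0.0, 0.3, 1.5],      # tracked head centre
                         finger=[0.1, 0.25, 1.0],  # pointing-hand position
                         plane_point=[0.0, 0.0, 0.0],
                         plane_normal=[0.0, 0.0, 1.0])
# → array([0.3, 0.15, 0.0]): the pointed-at location on the display plane
```

Using the face centre rather than a precisely located eye is consistent with the abstract's coarse precision result (objects of roughly 1.5 percent of the display diagonal).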

Citation (APA)

Carbini, S., Viallet, J. E., & Bernier, O. (2006). POINTING GESTURE VISUAL RECOGNITION BY BODY FEATURE DETECTION AND TRACKING. In Computer Vision and Graphics (pp. 203–208). Kluwer Academic Publishers. https://doi.org/10.1007/1-4020-4179-9_29
