Vision-based bare-hand gesture interface for interactive augmented reality applications

Abstract

This paper presents a bare-hand interaction method for augmented reality games based on human hand gestures. Point features are tracked from input video frames, and the motion of moving objects is computed. The patterns of the resulting motion trajectories are used to determine whether a motion is an intended gesture: a smooth trajectory toward one of the virtual objects or menus is classified as intended, and the corresponding action is invoked. To demonstrate the validity of the proposed method, we implemented two simple augmented reality applications: a gesture-based music player and a virtual basketball game. Experiments with three untrained users indicate that the accuracy of menu activation from intended gestures is 94% for normal-speed gestures and 84% for fast, abrupt gestures. © IFIP International Federation for Information Processing 2006.
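The classification idea in the abstract — a smooth trajectory aimed at a virtual object or menu counts as an intended gesture — can be sketched roughly as follows. This is a minimal illustration under assumed criteria (segment-direction variance for smoothness, angular alignment with the target for aim); the function name and thresholds are illustrative, not taken from the paper.

```python
import math

def is_intended_gesture(trajectory, target, max_angle_var=0.2, aim_tolerance=0.5):
    """Classify a 2D hand-motion trajectory as an intended gesture.

    A trajectory is treated as intended if it is smooth (small variance
    in the directions of consecutive segments) and its overall motion
    heads toward the target. Thresholds are illustrative assumptions.
    """
    if len(trajectory) < 3:
        return False

    # Direction (angle) of each consecutive segment of the trajectory.
    angles = [math.atan2(y1 - y0, x1 - x0)
              for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:])]

    # Smoothness test: jerky or abrupt motion has high direction variance.
    mean = sum(angles) / len(angles)
    variance = sum((a - mean) ** 2 for a in angles) / len(angles)
    if variance > max_angle_var:
        return False

    # Aim test: overall displacement must point roughly at the target.
    (sx, sy), (ex, ey) = trajectory[0], trajectory[-1]
    move = math.atan2(ey - sy, ex - sx)
    aim = math.atan2(target[1] - sy, target[0] - sx)
    diff = abs((move - aim + math.pi) % (2 * math.pi) - math.pi)  # wrap to [-pi, pi]
    return diff < aim_tolerance
```

A straight, steady sweep toward a menu item would pass both tests, while a zigzagging or overshooting motion fails the smoothness test — consistent with the paper's reported accuracy drop for fast, abrupt gestures.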

Citation (APA)

Yoon, J. H., Park, J. S., & Sung, M. Y. (2006). Vision-based bare-hand gesture interface for interactive augmented reality applications. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4161 LNCS, pp. 386–389). Springer Verlag. https://doi.org/10.1007/11872320_56
