Robust hand detection for augmented reality interface

Abstract

For interactive augmented reality, vision-based, hand-gesture-based interfaces are the most desirable because they are natural and human-friendly. However, detecting hands and recognizing hand gestures against cluttered backgrounds remain challenging, and the problem becomes even harder when the background contains a large skin-colored region. In this paper, we focus on detecting a hand reliably and propose an effective method. Our method rests on the assumption that the hand-forearm region (the hand together with part of the forearm) differs in brightness from other skin-colored regions. Specifically, we first segment the hand-forearm region from other skin-colored regions using this brightness difference, which we represent by edges. We then extract the hand region from the hand-forearm region by detecting a feature point that indicates the wrist. Finally, we extract the hand using a brightness-based segmentation that differs slightly from the hand-forearm detection step. We verify the effectiveness of our method by implementing a simple hand-gesture interface on top of it and applying the interface to augmented reality applications. Copyright © 2009 by the Association for Computing Machinery, Inc.
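The abstract outlines a three-step pipeline: segment the hand-forearm blob away from other skin-colored regions using brightness edges, locate a wrist feature point, and keep only the hand side of the cut. The sketch below is a minimal illustration of that pipeline, not the authors' implementation: it assumes OpenCV, a fixed HSV skin threshold, Canny edges as the brightness-edge cue, and a narrowest-row heuristic standing in for the paper's wrist feature point.

```python
import cv2
import numpy as np

def detect_hand(frame_bgr):
    """Illustrative hand-detection sketch: skin-color masking, edge-based
    splitting of the hand-forearm blob, and a crude wrist cut."""
    # 1. Skin-colored regions via an HSV threshold (illustrative range,
    #    not the paper's skin model).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    skin = cv2.morphologyEx(skin, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # 2. Brightness edges (Canny here) separate the hand-forearm region
    #    from other skin-colored regions such as the face or background.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    skin[edges > 0] = 0  # cut the skin mask along strong brightness edges

    # 3. Keep the largest connected skin component as the hand-forearm candidate.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(skin)
    if num < 2:
        return None
    best = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    forearm_mask = (labels == best).astype(np.uint8) * 255

    # 4. Crude wrist localization: cut at the row where the blob is narrowest
    #    (a stand-in for the paper's wrist feature point), assuming the hand
    #    lies above the wrist in the image.
    widths = forearm_mask.sum(axis=1) / 255
    rows = np.where(widths > 0)[0]
    wrist_row = rows[int(np.argmin(widths[rows]))]
    hand_mask = forearm_mask.copy()
    hand_mask[wrist_row:, :] = 0
    return hand_mask
```

In practice the threshold values, the edge detector, and the wrist heuristic would all need tuning or replacement with the paper's brightness-based segmentation and wrist feature point to reproduce the reported robustness.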

Citation (APA)

Choi, J., Seo, B. K., & Park, J. I. (2009). Robust hand detection for augmented reality interface. In Proceedings - VRCAI 2009: 8th International Conference on Virtual Reality Continuum and its Applications in Industry (pp. 319–321). https://doi.org/10.1145/1670252.1670324
