Gesture recognition for human-robot interaction through a knowledge based software platform

Abstract

Real-time hand gesture recognition is extremely challenging because of the large number of degrees of freedom (DOFs) in hand pose and motion. For natural human-robot interaction, however, gesture provides a powerful interface for commanding a robot to perform a specific task. This paper presents a vision-based real-time gesture recognition system that segments the three largest skin-color components of an image and applies template-matching techniques with multiple features. A gesture command is generated whenever the combination of the three skin-like regions in an image matches a predefined gesture. These commands are sent to robots through a knowledge-based software platform for human-robot interaction. The effectiveness of the method has been demonstrated by interacting with an entertainment robot named AIBO. © Springer-Verlag 2004.
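The sketch below illustrates one plausible reading of the pipeline described in the abstract: segment skin-colored pixels, keep the three largest connected components (typically the face and two hands), label each by template matching, and map the resulting combination to a predefined gesture command. The color-space bounds, template names, and gesture table are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of the skin-segmentation + template-matching pipeline
# outlined in the abstract. All thresholds and labels are assumptions.
import cv2
import numpy as np

# Assumed YCrCb skin-color bounds; the paper's actual thresholds are not given here.
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

def three_largest_skin_regions(frame_bgr, max_regions=3):
    """Return bounding boxes of the largest skin-colored components (e.g. face + two hands)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    # Skip label 0 (background); sort the remaining components by area, descending.
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1
    boxes = []
    for lbl in order[:max_regions]:
        x, y, w, h, _ = stats[lbl]
        boxes.append((int(x), int(y), int(w), int(h)))
    return boxes

def best_matching_template(region_gray, templates):
    """Match a segmented region against stored pose templates; return the best-scoring label."""
    best_label, best_score = None, -1.0
    for label, tmpl in templates.items():
        resized = cv2.resize(region_gray, (tmpl.shape[1], tmpl.shape[0]))
        score = cv2.matchTemplate(resized, tmpl, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_label, best_score = label, float(score)
    return best_label

# The combination of per-region labels is looked up in a predefined table to
# produce the gesture command forwarded to the robot platform (entry is illustrative).
GESTURE_TABLE = {("face", "open_palm", "fist"): "STOP"}
```

In this reading, the knowledge-based software platform mentioned in the abstract would receive the resulting command string (e.g. "STOP") and translate it into robot behavior; that interface is not shown here.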

Citation (APA)

Hasanuzzaman, M., Zhang, T., Ampornaramveth, V., Bhuiyan, M. A., Shirai, Y., & Ueno, H. (2004). Gesture recognition for human-robot interaction through a knowledge based software platform. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3211, 530–537. https://doi.org/10.1007/978-3-540-30125-7_66
