Face and Gesture Based Human Computer Interaction

  • Tu Y
  • Kao C
  • Lin H
  • et al.
Citations: N/A
Readers: 6 (Mendeley users who have this article in their library)

Abstract

In this paper, we present a face and gesture based human computer interaction (HCI) system that combines head pose and hand gestures to control the system. We identify the positions of the eyes and mouth and use the face center to estimate the head pose. Moreover, we introduce a technique for automatic gesture-area segmentation and orientation normalization of the hand gesture: the user does not need to hold gestures in an upright position, since the system segments and normalizes them automatically. The experimental results show that the proposed approach is accurate, with a gesture recognition rate of 93.6%. The user can also control multiple devices, including robots, simultaneously through a wireless network.
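The abstract does not spell out how orientation normalization is done. As a minimal sketch (not the paper's exact method), one common approach is to estimate the principal-axis angle of the segmented hand region from second-order central image moments and then rotate the region upright; the function name `gesture_orientation` and the synthetic mask below are illustrative assumptions.

```python
import numpy as np

def gesture_orientation(mask):
    """Principal-axis angle (radians) of a binary region mask,
    computed from second-order central image moments."""
    ys, xs = np.nonzero(mask)          # pixel coordinates of the region
    xbar, ybar = xs.mean(), ys.mean()  # centroid
    mu20 = ((xs - xbar) ** 2).mean()
    mu02 = ((ys - ybar) ** 2).mean()
    mu11 = ((xs - xbar) * (ys - ybar)).mean()
    # Standard moment-based orientation formula
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

# Synthetic stand-in for a tilted hand mask: a 45-degree diagonal stripe
tilted = np.eye(50)
theta = gesture_orientation(tilted)  # ≈ pi/4
# Rotating the gesture region by -theta would bring it upright
# before recognition, so the user need not hold the hand upright.
```

A real pipeline would apply this after skin-color or depth-based segmentation and rotate the cropped gesture patch by `-theta` prior to classification.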

Citation (APA)

Tu, Y.-J., Kao, C.-C., Lin, H.-Y., & Chang, C.-C. (2015). Face and Gesture Based Human Computer Interaction. International Journal of Signal Processing, Image Processing and Pattern Recognition, 8(9), 219–228. https://doi.org/10.14257/ijsip.2015.8.9.23
