Human-computer interaction based on hand gestures using RGB-D sensors


Abstract

In this paper we present a new method for hand gesture recognition based on an RGB-D sensor. The proposed approach takes advantage of depth information to cope with the most common problems of traditional video-based hand segmentation methods: cluttered backgrounds and occlusions. The algorithm also uses colour and semantic information to accurately identify any number of hands present in the image. Ten different static hand gestures are recognised, including all combinations of spread fingers. Additionally, movements of an open hand are tracked and six dynamic gestures are identified. The main advantage of our approach is that the user's hands may be at any position in the image, with no need to wear specific clothing or additional devices. Moreover, the whole method runs without any initial training or calibration. Experiments carried out with different users and in different environments demonstrate the accuracy and robustness of the method, which, additionally, runs in real time. © 2013 by the authors.
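The core idea of using depth to sidestep cluttered backgrounds can be illustrated with a minimal sketch. This is not the authors' actual algorithm (which also uses colour and semantic cues); it is a hypothetical depth-threshold segmentation that keeps only the surface nearest the camera, a common first step with RGB-D sensors:

```python
import numpy as np

def segment_hand_by_depth(depth, band=150):
    """Segment the region nearest the camera in a depth map.

    depth: 2-D array of depth values in millimetres (0 = invalid reading).
    band:  tolerance in millimetres around the nearest valid pixel.

    Returns a boolean mask of candidate hand pixels. Assumes the hand is
    the closest object to the sensor, which holds in typical HCI setups.
    """
    valid = depth > 0                        # discard missing depth readings
    if not valid.any():
        return np.zeros_like(depth, dtype=bool)
    nearest = depth[valid].min()             # closest surface to the camera
    return valid & (depth <= nearest + band)

# Toy 4x4 depth map: a "hand" at ~800 mm against a background at ~2000 mm,
# with one invalid pixel (0) as sensors often produce at object borders.
depth = np.array([[2000, 2000, 2000, 2000],
                  [2000,  810,  805, 2000],
                  [2000,  800,  815,    0],
                  [2000, 2000, 2000, 2000]])
mask = segment_hand_by_depth(depth)          # True only on the four hand pixels
```

Unlike a colour-only segmentation, this mask is unaffected by whatever is painted on the background wall, which is why depth handles cluttered scenes so well; occlusions between hands at different depths separate for the same reason.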

Citation (APA)

Palacios, J. M., Sagués, C., Montijano, E., & Llorente, S. (2013). Human-computer interaction based on hand gestures using RGB-D sensors. Sensors (Switzerland), 13(9), 11842–11860. https://doi.org/10.3390/s130911842
