Real-time hand gesture recognition for uncontrolled environments using adaptive SURF tracking and hidden conditional random fields

Abstract

The challenges posed by uncontrolled environments are the main difficulty in making hand gesture recognition methods robust in real-world scenarios. In this paper, we propose a real-time, purely vision-based method for hand gesture recognition in uncontrolled environments. A novel tracking method is introduced to track multiple hand candidates from the first frame, and the movement directions of all hand candidates are extracted as trajectory features. A modified hidden conditional random field (HCRF) model is then used to classify gestures. The proposed method copes with challenges including the gesturing hand leaving the scene, pauses during gestures, complex backgrounds, skin-coloured regions moving in the background, performers wearing short sleeves, and the face overlapping with the hand. The method has been tested on the Palm Graffiti Digits database and the Warwick Hand Gesture database. Experimental results show that the proposed method performs well in uncontrolled environments. © 2013 Springer-Verlag.
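
The abstract outlines a pipeline of candidate tracking, movement-direction trajectory features, and HCRF classification. The sketch below illustrates only the trajectory-feature step under stated assumptions and is not the authors' adaptive SURF tracker or HCRF model: it matches keypoints between consecutive frames and quantizes the average displacement into one of eight direction codes, producing the kind of observation sequence a sequence classifier such as an HCRF could consume. Since SURF is only available in the non-free opencv-contrib build (cv2.xfeatures2d.SURF_create), ORB is used here as a stand-in detector; the file name and all helper names are illustrative.

import math
import cv2
import numpy as np

detector = cv2.ORB_create(nfeatures=200)                 # stand-in for SURF (non-free module)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def direction_code(prev_frame, curr_frame, n_bins=8):
    """Quantize the average keypoint displacement between two frames
    into one of n_bins direction codes; return None if nothing matches."""
    g1 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    kp1, des1 = detector.detectAndCompute(g1, None)
    kp2, des2 = detector.detectAndCompute(g2, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.match(des1, des2)
    if not matches:
        return None
    # Average displacement of matched keypoints approximates the dominant motion.
    dx = np.mean([kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0] for m in matches])
    dy = np.mean([kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1] for m in matches])
    angle = math.atan2(dy, dx) % (2.0 * math.pi)
    return int(angle // (2.0 * math.pi / n_bins)) % n_bins

cap = cv2.VideoCapture("gesture_clip.avi")               # hypothetical input video
ok, prev = cap.read()
trajectory = []                                          # one direction code per frame pair
while ok:
    ok, curr = cap.read()
    if not ok:
        break
    code = direction_code(prev, curr)
    if code is not None:
        trajectory.append(code)
    prev = curr
cap.release()
print(trajectory)   # e.g. [2, 2, 3, 3, ...]: an observation sequence for a model such as an HCRF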

Citation (APA)

Yao, Y., & Li, C. T. (2013). Real-time hand gesture recognition for uncontrolled environments using adaptive SURF tracking and hidden conditional random fields. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8034 LNCS, pp. 542–551). https://doi.org/10.1007/978-3-642-41939-3_53
