Facilitated gesture recognition based interfaces for people with upper extremity physical impairments

Abstract

A gesture recognition based interface was developed to give people with upper extremity physical impairments an alternative way to perform laboratory experiments that require 'physical' manipulation of components. A particle filter framework based on color, depth, and spatial information was constructed, with unique descriptive features representing the face and hands. The same feature encoding policy was subsequently used to detect, track, and recognize the user's hands. Motion models were created using the dynamic time warping (DTW) method for better observation encoding. Finally, the hand trajectories were classified into command classes by applying the CONDENSATION method, and on this basis an interface was designed for robot control, achieving a recognition accuracy of 97.5%. To assess the gesture recognition and control policies, a validation experiment was conducted in which a mobile service robot and a robotic arm were controlled in a laboratory environment. © 2012 Springer-Verlag.
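
The paper itself provides no source code. As a rough illustration of the trajectory-matching idea behind the DTW motion models, the following Python sketch computes a dynamic time warping distance between an observed hand trajectory and a set of per-command templates and returns the closest command. The function names and the nearest-template decision rule are illustrative assumptions; the actual system builds DTW-based motion models and classifies trajectories with the CONDENSATION method rather than plain nearest-neighbor matching.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic DTW distance between two trajectories of N-D points."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(seq_a[i - 1], dtype=float)
                               - np.asarray(seq_b[j - 1], dtype=float))
            # Extend the cheapest of the three admissible warping steps.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(trajectory, templates):
    """Return the command label whose template trajectory is nearest under DTW."""
    return min(templates, key=lambda label: dtw_distance(trajectory, templates[label]))

if __name__ == "__main__":
    # Hypothetical 2-D hand-centroid templates for two commands.
    templates = {
        "wave": [(0, 0), (1, 1), (0, 2), (1, 3)],
        "push": [(0, 0), (0, 1), (0, 2), (0, 3)],
    }
    observed = [(0, 0), (1, 1), (0, 2), (1, 3), (0, 4)]
    print(classify(observed, templates))  # -> "wave"
```

Because DTW tolerates differences in execution speed between an observed gesture and its template, it is a natural fit for encoding gesture motion models before a probabilistic tracker such as CONDENSATION scores candidate trajectories.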

Citation (APA)

Jiang, H., Wachs, J. P., & Duerstock, B. S. (2012). Facilitated gesture recognition based interfaces for people with upper extremity physical impairments. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7441 LNCS, pp. 228–235). https://doi.org/10.1007/978-3-642-33275-3_28
