A real-time hand gesture interface for medical visualization applications

Abstract

In this paper, we consider a vision-based system that interprets a user's gestures in real time to manipulate objects within a medical data visualization environment. Dynamic navigation gestures are translated into commands based on their relative positions on the screen. Static gesture poses are identified to execute non-directional commands. This is accomplished by using Haar-like features to represent the shape of the hand. These features are then input to a fuzzy c-means (FCM) clustering algorithm for pose classification. A probabilistic neighborhood search algorithm is employed to automatically select a small number of Haar features and to tune the FCM classifier. The gesture recognition system was implemented in a sterile medical data-browser environment. Test results on four interface tasks showed that using a few Haar features with the supervised FCM yielded successful performance rates of 95% to 100%. In addition, a small exploratory test of the AdaBoost Haar system was made to detect a single hand gesture and to assess its suitability for hand gesture recognition.
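
The abstract does not include code; the following is a minimal Python sketch of the pipeline it describes, assuming two-rectangle Haar-like features and a plain NumPy implementation of standard fuzzy c-means. The window coordinates, cluster count, and fuzzifier m are illustrative placeholders, not values from the paper.

# Minimal sketch (not the authors' code): two-rectangle Haar-like
# features over a hand image patch, classified with standard FCM.
import numpy as np

def haar_two_rect(img, x, y, w, h):
    """Two-rectangle Haar-like feature: sum of the left half of the
    (x, y, w, h) window minus the sum of the right half."""
    left = img[y:y + h, x:x + w // 2].astype(float).sum()
    right = img[y:y + h, x + w // 2:x + w].astype(float).sum()
    return left - right

def fcm(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means: returns cluster centers and the
    (n, c) membership matrix U."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)      # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance of every sample to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)              # avoid division by zero
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2 / (m - 1))
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :])
                       ** (2.0 / (m - 1.0))).sum(axis=2)
        converged = np.abs(U_new - U).max() < tol
        U = U_new
        if converged:
            break
    return centers, U

# Toy usage: extract a few Haar features from synthetic "hand" patches
# (stand-ins for real segmented hand images) and cluster into two poses.
rng = np.random.default_rng(1)
patches = rng.integers(0, 256, size=(40, 64, 64))
windows = [(0, 0, 32, 32), (16, 16, 32, 32), (8, 24, 48, 16)]
X = np.array([[haar_two_rect(p, *w) for w in windows] for p in patches])
centers, U = fcm(X, c=2)
poses = U.argmax(axis=1)   # hard pose label = highest membership
print(poses)

One practical consequence of using FCM rather than a hard classifier is that the membership matrix U provides a soft confidence per pose, which an interface could use to reject ambiguous gestures; whether the authors' system does this is not stated in the abstract.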

Citation (APA)

Wachs, J., Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2006). A real-time hand gesture interface for medical visualization applications. In Advances in Soft Computing (Vol. 36, pp. 153–162). https://doi.org/10.1007/978-3-540-36266-1_15
