Abstract
Unlike sign language, which typically relies on large-scale movements to form a gesture, finger language, which is well suited to handicapped aphasiacs, consists of relatively small-scale hand gestures produced merely by changing how a patient bends his or her fingers. A practical recognition system must therefore accommodate the specific abilities of each individual patient. We propose a system that meets this requirement by combining a programmable data glove that captures subtle finger movements, a function parameterized by optical signal values that computes finger bending degrees, and an automatic regression module that extracts the finger features most suitable for a specific patient. The selected features are fed into a neural network, which learns a finger language recognition model for that patient; the system is then ready for use by that specific user. At the time of this writing, unbiased field experiments yielded an average recognition success rate of 100%.
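The per-user pipeline summarized above can be illustrated with a minimal sketch. All names here are hypothetical: it assumes a linear calibration from raw optical glove readings to bending degrees, a variance-based channel ranking as a stand-in for the paper's automatic regression module, and a nearest-centroid rule in place of the trained neural network.

```python
# Hypothetical sketch of the abstract's pipeline: raw optical glove
# readings -> bending degrees -> per-user feature selection -> a
# simple classifier standing in for the paper's neural network.

def bending_degree(raw, straight, bent):
    """Map a raw optical sensor value to a 0-90 degree bend, assuming a
    linear response between the user's calibrated straight-finger and
    fully-bent readings (an assumption, not the paper's exact function)."""
    return 90.0 * (raw - straight) / (bent - straight)

def select_features(samples, k):
    """Keep the k finger channels with the largest variance across the
    user's calibration samples -- a crude stand-in for the automatic
    regression module that picks the most adequate features per patient."""
    n = len(samples[0])
    means = [sum(s[i] for s in samples) / len(samples) for i in range(n)]
    var = [sum((s[i] - means[i]) ** 2 for s in samples) for i in range(n)]
    return sorted(range(n), key=lambda i: var[i], reverse=True)[:k]

def classify(sample, centroids, feats):
    """Nearest-centroid decision over the selected channels, replacing
    the neural network for illustration purposes."""
    def dist(c):
        return sum((sample[i] - c[i]) ** 2 for i in feats)
    return min(centroids, key=lambda g: dist(centroids[g]))
```

For example, calibrating a channel with straight/bent readings of 100 and 1000 maps a raw value of 550 to a 45-degree bend, and a new sample is then assigned the gesture whose per-user centroid is closest on the selected channels.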
Citation
Fu, Y. F., & Ho, C. S. (2009). A user-dependent easily-adjusted static finger language recognition system for handicapped aphasiacs. Applied Artificial Intelligence, 23(10), 932–944. https://doi.org/10.1080/08839510903363487