User-Defined Hand Gesture Interface to Improve User Experience of Learning American Sign Language


Abstract

Sign language enables effective communication between hearing and deaf-mute people. Despite years of extensive pedagogical research, learning sign language remains a formidable task. Most current systems rely heavily on online learning resources and presume that users will access them regularly, yet this approach can feel monotonous and repetitive. Recently, gamification has been proposed as a solution, but the research focus has been on game design rather than user experience design. In this work, we present a system for user-defined interaction for learning static American Sign Language (ASL). It supports gesture recognition for user experience design and enables users to learn actively through involvement with user-defined gestures, rather than passively absorbing knowledge. Early findings from a questionnaire-based survey show that users are more motivated to learn static ASL through user-defined interactions.
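The abstract does not specify the recognition pipeline, but the idea of user-defined static gestures can be illustrated with a minimal sketch: users record their own example poses as hand-landmark templates, and new poses are matched by nearest-neighbour comparison. The landmark layout (21 keypoints), normalization scheme, and distance threshold below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumption, not the paper's actual method): user-defined
# static gestures stored as normalized hand-landmark templates and classified
# by nearest-neighbour matching.
import numpy as np


class UserDefinedGestureRecognizer:
    def __init__(self, threshold=0.15):
        self.templates = {}          # label -> list of normalized landmark vectors
        self.threshold = threshold   # max accepted match distance (assumed value)

    @staticmethod
    def _normalize(landmarks):
        """Center on the wrist and scale by hand size so templates are
        translation- and scale-invariant. `landmarks` is a (21, 2) array of
        hand keypoints from any hand-tracking front end."""
        pts = np.asarray(landmarks, dtype=float)
        pts = pts - pts[0]                       # wrist as origin
        scale = np.linalg.norm(pts, axis=1).max()
        return (pts / scale).ravel() if scale > 0 else pts.ravel()

    def add_gesture(self, label, landmarks):
        """Store one user-defined example pose under `label`."""
        self.templates.setdefault(label, []).append(self._normalize(landmarks))

    def classify(self, landmarks):
        """Return the best-matching label, or None if nothing is close enough."""
        query = self._normalize(landmarks)
        best_label, best_dist = None, float("inf")
        for label, examples in self.templates.items():
            for template in examples:
                dist = np.linalg.norm(query - template) / np.sqrt(len(query))
                if dist < best_dist:
                    best_label, best_dist = label, dist
        return best_label if best_dist <= self.threshold else None
```

In such a setup the learner's own recorded poses become the reference vocabulary, which is what lets the interaction be "user-defined" rather than fixed by the system designer.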

Citation (APA)

Wang, J., Ivrissimtzis, I., Li, Z., Zhou, Y., & Shi, L. (2023). User-Defined Hand Gesture Interface to Improve User Experience of Learning American Sign Language. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13891 LNCS, pp. 479–490). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-32883-1_43
