Synthetic animation of deaf signing gestures


Abstract

We describe a method for automatically synthesizing deaf signing animations from a high-level description of signs in terms of the HamNoSys transcription system. Lifelike movement is achieved by combining a simple control model of hand movement with inverse kinematic calculations for placement of the arms. The realism can be further enhanced by mixing the synthesized animation with motion capture data for the spine and neck, to add natural “ambient motion”.
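The paper does not reproduce its inverse kinematic calculations here, but the kind of computation involved in arm placement can be illustrated with a minimal two-link planar solver (all names and link lengths below are hypothetical placeholders, not the paper's model):

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.25):
    """Solve joint angles for a planar two-link 'arm' reaching (x, y).

    l1, l2 are upper-arm and forearm lengths (placeholder values);
    returns (shoulder, elbow) angles in radians, one of the two
    possible elbow configurations.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle; clamp for numerical safety.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.3, l2=0.25):
    """Forward kinematics, used to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

# Place the hand at a reachable target and check the solution round-trips.
target = (0.2, 0.35)
s, e = two_link_ik(*target)
hx, hy = forward(s, e)
```

In a full signing avatar, a solver of this general shape would be applied in 3D, per frame, to place the wrist at the hand location specified by the transcription, with the synthesized joint trajectories then blended with captured spine and neck motion.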

Citation (APA)

Kennaway, R. (2002). Synthetic animation of deaf signing gestures. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2298, pp. 146–157). Springer Verlag. https://doi.org/10.1007/3-540-47873-6_15
