A complete system for the specification and the generation of sign language gestures

Abstract

This paper describes a system called GeSsyCa that produces synthetic sign language gestures from a high-level specification. The specification is written in a language based both on a discrete description of space and on a movement decomposition inspired by sign language gestures. Communication gestures are represented through symbolic commands that can be described by qualitative data and translated into spatio-temporal targets driving a generation system. Such an approach is possible for the class of generation models controlled through key-point information. The generation model used in our approach is composed of a set of sensorimotor servo-loops. Each loop resolves its inversion in real time from the direct specification of location targets, while satisfying psycho-motor laws of biological movement. The whole control system is applied to the synthesis of communication and sign language gestures, and a validation of the synthesized movements is presented.
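To illustrate the general idea of target-driven generation described above, the sketch below produces a point-to-point hand trajectory from a single spatial target using a minimum-jerk position profile, one common model of the bell-shaped velocity regularities associated with biological movement. This is an illustrative assumption, not the authors' servo-loop model; the function name and the example coordinates are hypothetical.

```python
import numpy as np

def minimum_jerk_trajectory(start, target, duration, dt=0.01):
    """Generate a hand trajectory from `start` toward a spatial `target`.

    Sketch only: a minimum-jerk profile stands in for the psycho-motor
    laws mentioned in the abstract; the paper's actual generation model
    is a set of sensorimotor servo-loops, not this closed-form profile.
    """
    start, target = np.asarray(start, float), np.asarray(target, float)
    times = np.arange(0.0, duration + dt, dt)
    s = np.clip(times / duration, 0.0, 1.0)      # normalized time in [0, 1]
    # Smooth interpolation with zero velocity and acceleration at both ends,
    # which yields the characteristic bell-shaped velocity profile.
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5
    return start + np.outer(blend, target - start)

# Example: drive the hand from a rest position to a hypothetical target
# point in a discretized signing space.
trajectory = minimum_jerk_trajectory(start=[0.0, 0.0, 0.0],
                                     target=[0.3, 0.4, 0.2],
                                     duration=0.8)
print(trajectory.shape)   # (timesteps, 3) sequence of hand positions
```

In the paper's framework, such spatio-temporal targets would come from the symbolic specification language rather than being given as raw coordinates.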

Citation
Lebourque, T., & Gibet, S. (1999). A complete system for the specification and the generation of sign language gestures. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1739, pp. 227–238). Springer Verlag. https://doi.org/10.1007/3-540-46616-9_20
