Collaborative integration of speech and 3D gesture for map-based applications


Abstract

QuickSet [6] is a multimodal system that gives users the capability to create and control map-based collaborative interactive simulations by supporting simultaneous input from speech and pen gestures. In this paper, we report on an augmentation of the graphical pen input that enables drawings to be formed by 3D hand movements. While pen and mouse can still be used for ink generation, drawing can also occur through natural human pointing. To that end, we use the hand to define a line in space and compute its intersection with a virtual paper, a bounded plane in three-dimensional space that the operator specifies at the beginning of the interaction session. The entire system can be seen as a collaborative, body-centered alternative to traditional mouse-, pen-, or keyboard-based multimodal graphical programs. Its potential applications include battlefield or crisis management, telemedicine, and other types of collaborative decision-making in which users may also be mobile. © Springer-Verlag 2004.
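The hand-pointing technique the abstract describes amounts to a ray–plane intersection: the hand defines a ray, and the operator-defined "virtual paper" is a bounded plane. A minimal sketch of that geometry, with illustrative names and parameters not taken from the paper (here the plane is given by a point and a normal, and boundedness is left out for brevity):

```python
# Hypothetical sketch of the ray-plane intersection underlying the
# hand-pointing input. The hand defines a ray (origin + direction);
# the "virtual paper" is modeled as a plane through plane_point with
# normal plane_normal. All names are illustrative, not from the paper.

from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def sub(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a: Vec3, b: Vec3) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def add_scaled(a: Vec3, d: Vec3, t: float) -> Vec3:
    return (a[0] + t * d[0], a[1] + t * d[1], a[2] + t * d[2])

def intersect_virtual_paper(origin: Vec3, direction: Vec3,
                            plane_point: Vec3, plane_normal: Vec3,
                            eps: float = 1e-9) -> Optional[Vec3]:
    """Return the point where the pointing ray meets the plane, or None
    if the ray is parallel to the plane or the plane lies behind the hand."""
    denom = dot(direction, plane_normal)
    if abs(denom) < eps:   # ray parallel to the virtual paper
        return None
    t = dot(sub(plane_point, origin), plane_normal) / denom
    if t < 0:              # paper is behind the pointing direction
        return None
    return add_scaled(origin, direction, t)
```

For example, a hand at the origin pointing along +z toward a paper at z = 2 yields the ink point (0, 0, 2); a ray parallel to the paper yields no intersection, so no ink is produced.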

Citation (APA)

Corradini, A. (2004). Collaborative integration of speech and 3D gesture for map-based applications. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3038, 913–920. https://doi.org/10.1007/978-3-540-24688-6_117
