Integrating intra and extra gestures into a mobile and multimodal shopping assistant

Abstract

With the rise of mobile and pervasive computing, applications increasingly need to adapt to their surrounding environments and present information from those environments in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed with real-world artifacts. The described approach extends reference resolution based on speech, handwriting, and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users, which arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then serves as the basis for a usability study, described here, on user interaction within mobile contexts. © Springer-Verlag Berlin Heidelberg 2005.
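
As an illustration of the kind of fusion such a system performs, the sketch below combines recognised events from speech, handwriting, on-device ("intra") gestures, and real-world ("extra") gestures into a single resolved product reference. This is a minimal sketch under assumed names and parameters (ModalityEvent, resolve_reference, a two-second fusion window); it does not reproduce the paper's actual components or algorithms.

```python
# Illustrative sketch (not the authors' implementation): fuse recognised
# events from several modalities into one resolved product reference.
# All names and thresholds here are assumptions made for this example.

from dataclasses import dataclass
from typing import Optional

FUSION_WINDOW_S = 2.0  # events closer together than this are considered related


@dataclass
class ModalityEvent:
    modality: str      # "speech", "handwriting", "intra_gesture", "extra_gesture"
    referent: str      # object the event points at, e.g. "camera_42"
    confidence: float  # recogniser confidence in [0, 1]
    timestamp: float   # seconds since the interaction started


def resolve_reference(events: list[ModalityEvent]) -> Optional[str]:
    """Pick the referent best supported by temporally close, confident events."""
    if not events:
        return None
    anchor = max(events, key=lambda e: e.timestamp)  # most recent event
    scores: dict[str, float] = {}
    for e in events:
        if anchor.timestamp - e.timestamp <= FUSION_WINDOW_S:
            scores[e.referent] = scores.get(e.referent, 0.0) + e.confidence
    return max(scores, key=scores.get) if scores else None


if __name__ == "__main__":
    events = [
        ModalityEvent("speech", "camera_42", 0.6, 10.1),         # "tell me about this one"
        ModalityEvent("extra_gesture", "camera_42", 0.9, 10.4),  # camera picked up off the shelf
        ModalityEvent("intra_gesture", "camera_17", 0.4, 8.0),   # earlier tap on the handheld
    ]
    print(resolve_reference(events))  # -> camera_42
```

In this toy scheme, a vague spoken reference ("this one") is disambiguated by the extra gesture of picking the product up, while a stale on-device selection outside the fusion window is ignored.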

Citation (APA)

Wasinger, R., Krüger, A., & Jacobs, O. (2005). Integrating intra and extra gestures into a mobile and multimodal shopping assistant. In Lecture Notes in Computer Science (Vol. 3468, pp. 297–314). Springer Verlag. https://doi.org/10.1007/11428572_18
