With the rise of mobile and pervasive computing, applications increasingly need to adapt to their surrounding environments and deliver information about those environments to users in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed with real-world artifacts. The described approach extends reference resolution based on speech, handwriting, and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions, and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then serves as the basis for a recent usability study, described here, on user interaction within mobile contexts. © Springer-Verlag Berlin Heidelberg 2005.
CITATION STYLE
Wasinger, R., Krüger, A., & Jacobs, O. (2005). Integrating intra and extra gestures into a mobile and multimodal shopping assistant. In Lecture Notes in Computer Science (Vol. 3468, pp. 297–314). Springer-Verlag. https://doi.org/10.1007/11428572_18