Interaction history in adaptive multimodal interaction


Abstract

Modern Companion-Technologies provide multimodal and adaptive interaction possibilities. However, it is still unclear which user characteristics should be used, and in which manner, to optimally support the interaction. An important aspect is that users themselves learn and adapt their behavior and preferences based on their own experiences. In other words, certain characteristics of user behavior are slowly but continuously changed and updated by the users themselves over multiple encounters with the Companion-Technology. Thus, a biological adaptive multimodal system observes and interacts with an electronic one, and vice versa. Consequently, such a user-centered interaction history is essential and should be integrated into the prediction of user behavior. Doing so enables the Companion to achieve more robust predictions of user behavior, which in turn leads to better fusion decisions and more efficient customization of the user interface. We present the development of an experimental paradigm based on visual search tasks. The setup allows the induction of various user experiences as well as the testing of their effects on user behavior and preferences during multimodal interaction.

Citation (APA)

Bubalo, N., Schüssel, F., Honold, F., Weber, M., & Huckauf, A. (2017). Interaction history in adaptive multimodal interaction. In Cognitive Technologies (pp. 231–251). Springer Verlag. https://doi.org/10.1007/978-3-319-43665-4_12
