Effects of Proactive Dialogue Strategies on Human-Computer Trust


Abstract

Intelligent computer systems aim to assist users with challenging tasks such as decision-making, planning, or learning. To offer optimal assistance, such systems must decide when to behave reactively or proactively and how proactive behaviour should be designed, especially since this decision may greatly influence the user's trust in the system. We therefore conducted a mixed-factorial study examining how subjects trust different levels of proactivity (none, notification, suggestion, and intervention) and different timing strategies (fixed-timing and insecurity-based) while performing a planning task. The results showed that proactive system behaviour is perceived as trustworthy in insecure situations regardless of timing. However, proactive dialogue had strong effects on cognition-based trust (the system's perceived competence and reliability) depending on task difficulty. Furthermore, fully autonomous system behaviour fails to establish an adequate human-computer trust relationship, in contrast to more conservative strategies.
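The four proactivity levels and the insecurity-based timing strategy described above can be sketched as a simple dialogue policy. This is a hypothetical illustration, not the authors' implementation: the names `ProactivityLevel`, `choose_action`, and the uncertainty threshold are assumptions introduced here for clarity.

```python
from enum import Enum

class ProactivityLevel(Enum):
    NONE = "none"                  # purely reactive: act only on user request
    NOTIFICATION = "notification"  # point out that help is available
    SUGGESTION = "suggestion"      # propose a concrete next step
    INTERVENTION = "intervention"  # act autonomously on the user's behalf

def choose_action(level: ProactivityLevel,
                  user_uncertainty: float,
                  threshold: float = 0.5,
                  insecurity_based: bool = True) -> ProactivityLevel:
    """Hypothetical policy: with insecurity-based timing, the system stays
    reactive unless the estimated user uncertainty exceeds a threshold;
    with fixed timing, it always acts at its configured proactivity level."""
    if insecurity_based and user_uncertainty < threshold:
        return ProactivityLevel.NONE
    return level
```

Under such a policy, the study's finding would correspond to triggering proactive dialogue only in insecure situations (high `user_uncertainty`), while capping the level below full `INTERVENTION` to preserve trust.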

Citation (APA)

Kraus, M., Wagner, N., & Minker, W. (2020). Effects of Proactive Dialogue Strategies on Human-Computer Trust. In UMAP 2020 - Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization (pp. 107–116). Association for Computing Machinery, Inc. https://doi.org/10.1145/3340631.3394840
