Justification and Transparency Explanations in Dialogue Systems to Maintain Human-Computer Trust

  • Nothdurft, F.
  • Minker, W.

Abstract

This paper describes a web-based study testing the effects of different explanation types on the human-computer trust relationship. Human-computer trust has been shown to be very important for keeping the user motivated and cooperative in a human-computer interaction. Unexpected or incomprehensible situations in particular may decrease trust and thereby change the way the user interacts with a technical system. Analogous to human-human interaction, providing explanations in these situations can help to remedy such negative effects. However, selecting the appropriate explanation based on the user's human-computer trust is an unprecedented approach, because existing studies treat trust as a one-dimensional concept. In this study we try to find a mapping between the bases of trust and the different goals of explanations. Our results show that transparency explanations seem to be the best way to influence the user's perceived understandability and reliability of the system.

Cite

APA

Nothdurft, F., & Minker, W. (2016). Justification and Transparency Explanations in Dialogue Systems to Maintain Human-Computer Trust (pp. 41–50). https://doi.org/10.1007/978-3-319-21834-2_4
