Measuring relational trust in human-robot interactions


Abstract

Trust is integral to almost any human-robot interaction (HRI). The most technologically advanced robot can sit unused if a human interactant does not trust it. Conversely, a robot that is trusted too much may be assumed to be more capable than it truly is, resulting in over-reliance on an imperfect system [8]. One of the most widely used definitions of trust in HRI holds that trust is "the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability" [6]. When robots first entered society in factory roles, their goals were clear and their performance was concretely measurable with metrics such as time to completion, number of errors in a given behavior, and consistency of behavior over time. The humans who worked with them could therefore base their trust on how effectively the robots achieved these clearly defined goals.

Citation (APA)

Law, T. (2020). Measuring relational trust in human-robot interactions. In ACM/IEEE International Conference on Human-Robot Interaction (pp. 579–581). IEEE Computer Society. https://doi.org/10.1145/3371382.3377435
