Examining trust and reliance in collaborations between humans and automated agents

Citations: 30
Mendeley readers: 57

Abstract

Human trust and reliance in artificial agents are critical to effective collaboration in mixed human-computer teams. Understanding the conditions under which humans trust and rely upon automated agent recommendations is important, as trust is one of the mechanisms that allow people to interact effectively with a variety of teammates. We conducted exploratory research to investigate how personality characteristics and uncertainty conditions affect human-machine interactions. Participants were asked to determine whether two images depicted the same or different people while simultaneously considering the recommendation of an automated agent. Results of this effort demonstrated a correlation between judgements of agent expertise and user trust. In addition, we found that under both high- and low-uncertainty conditions, participants' decision outcomes moved significantly in the direction of the agent's recommendation. Differences in reported trust in the agent were observed between individuals with low and high levels of extraversion.

Cite

APA

Elson, J. S., Derrick, D. C., & Ligon, G. S. (2018). Examining trust and reliance in collaborations between humans and automated agents. In Proceedings of the Annual Hawaii International Conference on System Sciences (Vol. 2018-January, pp. 430–439). IEEE Computer Society. https://doi.org/10.24251/hicss.2018.056
