Applications of automated agent systems in daily life have changed the role of human operators from that of a controller to that of a teammate. However, this 'teammate' relationship between humans and agents raises an important but challenging question: how do humans develop trust when interacting with automated agents that are human-like? In this study, a two-phase online experiment was conducted to examine the effect of attitudinal congruence and individual personality on users' trust toward an anthropomorphic agent. Our results suggest that the degree of an agent's response congruence had no significant impact on users' trust toward the agent. In terms of individual personality, we found one trait that significantly affects users' formation of human-agent trust. Although our data do not support an effect of attitudinal congruence on human-agent trust formation, this study provides essential empirical evidence to benefit future research in this field. More importantly, in this paper we address the unusual challenges in our experimental design and what our null results imply about the formation of human-agent trust. This study not only sheds light on trust formation in human-agent collaboration but also provides insight for the future design of automated agent systems.
Huang, H. Y., Twidale, M., & Bashir, M. (2020). 'If you agree with me, do I trust you?': An examination of human-agent trust from a psychological perspective. In Advances in Intelligent Systems and Computing (Vol. 1038, pp. 994–1013). Springer Verlag. https://doi.org/10.1007/978-3-030-29513-4_73