Trusting autonomous vehicles as moral agents improves related policy support

Abstract

Compared to human-operated vehicles, autonomous vehicles (AVs) offer numerous potential benefits. However, public acceptance of AVs remains low. Using 4 studies, including 1 preregistered experiment (total N = 3,937), the present research examines the role of trust in AV adoption decisions. Using the Trust-Confidence-Cooperation model as a conceptual framework, we evaluate whether perceived integrity of technology—a previously underexplored dimension of trust that refers to perceptions of the moral agency of a given technology—influences AV policy support and adoption intent. We find that perceived technology integrity predicts adoption intent for AVs and that messages that increase perceived integrity of AV technology result in greater AV adoption intent and policy support. This knowledge can be used to guide communication efforts aimed at increasing public trust in AVs, and ultimately enhance integration of AVs into transport systems.

Citation (APA)

Hurst, K. F., & Sintov, N. D. (2022). Trusting autonomous vehicles as moral agents improves related policy support. Frontiers in Psychology, 13. https://doi.org/10.3389/fpsyg.2022.976023
