Generational differences in trust in digital assistants


Abstract

Human trust in automation has been studied extensively within safety-critical domains (military, aviation, process control, etc.) because harmful consequences are associated with the improper calibration of trust in automated systems in these domains (Parasuraman & Riley, 1997). As such, researchers have worked to identify important factors that help humans build trust in such systems (Hoff & Bashir, 2015). With the explosion of AI in consumer technologies, it is becoming equally critical to understand how humans interact with everyday devices. This study investigated how factors identified as impacting trust in automation in safety-critical domains influence the trust and use of popular digital assistants (Siri, Cortana, Bixby, or Google Now). We conducted an online survey with 278 regular users of digital assistants across three generations (GenX, GenY, and GenZ). The results demonstrate that, even after controlling for dispositional factors (i.e., individual characteristics such as age, culture, and gender), GenZ exhibited higher trust in digital assistants than GenX. More interestingly, linear regression analyses revealed a different set of predictors of trust for each generation. Results from this survey have implications for the design of digital assistants.

Citation (APA)

Noah, B., & Sethumadhavan, A. (2019). Generational differences in trust in digital assistants. In Proceedings of the Human Factors and Ergonomics Society (Vol. 63, pp. 206–210). SAGE Publications Inc. https://doi.org/10.1177/1071181319631029
