Trust in hybrid human-automated decision-support

Abstract

Research has examined trust in humans and trust in automated decision support. Although hybrid human-automation teams reflect a likely realization of decision support in high-risk tasks such as personnel selection, trust in such teams has thus far received limited attention. In two experiments (N1 = 170, N2 = 154), we compare trust, trustworthiness, and trusting behavior for different types of decision support (automated, human, hybrid) across two assessment contexts (personnel selection, bonus payments). We additionally examined a possible trust violation by presenting one group of participants with a preselection that consisted predominantly of male candidates, thus reflecting possible unfair bias. Whereas fully automated decisions were trusted less, results suggest that trust in hybrid decision support was similar to trust in human-only support. Trust violations were not perceived differently depending on the type of support. We discuss theoretical implications (e.g., trust in hybrid support) and practical implications (e.g., keeping humans in the loop to prevent negative reactions).

Citation (APA)
Kares, F., König, C. J., Bergs, R., Protzel, C., & Langer, M. (2023). Trust in hybrid human-automated decision-support. International Journal of Selection and Assessment, 31(3), 388–402. https://doi.org/10.1111/ijsa.12423
