Toward Adaptive Trust Calibration for Level 2 Driving Automation

41 citations · 54 readers (Mendeley)

Abstract

Properly calibrated human trust is essential for successful interaction between humans and automation. However, while trust calibration can be improved by increased automation transparency, too much transparency can increase human workload. To address this tradeoff, we present a probabilistic framework using a partially observable Markov decision process (POMDP) for modeling the coupled trust-workload dynamics of human behavior in an action automation context. We specifically consider hands-off Level 2 driving automation in a city environment involving multiple intersections where the human chooses whether or not to rely on the automation. We consider automation reliability, automation transparency, and scene complexity, along with human reliance and eye-gaze behavior, to model the dynamics of human trust and workload. We demonstrate that our model framework can appropriately vary automation transparency based on real-time human trust and workload belief estimates to achieve trust calibration.
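To make the abstract's mechanism concrete, the sketch below shows a generic discrete POMDP belief update over coupled trust-workload states, where an observed reliance decision updates the belief that then drives transparency selection. All state names, actions, and probabilities here are illustrative placeholders, not the parameters identified in the paper.

```python
# Hedged sketch of the belief-update step in a trust-workload POMDP.
# States combine trust (T) and workload (W) levels; the transparency
# level is the action; observed reliance is the observation.
# Every number below is a made-up placeholder for illustration.

STATES = ["T-low/W-low", "T-low/W-high", "T-high/W-low", "T-high/W-high"]
ACTIONS = ["low_transparency", "high_transparency"]
OBSERVATIONS = ["rely", "not_rely"]  # observed reliance behavior

def update_belief(belief, T, O, a, o):
    """Bayes filter: b'(s') ∝ O[a][s'][o] * sum_s T[a][s][s'] * b(s)."""
    n = len(belief)
    unnorm = [O[a][sp][o] * sum(T[a][s][sp] * belief[s] for s in range(n))
              for sp in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm] if z else list(belief)

n = len(STATES)
# Placeholder transition model: uniform dynamics under either action.
T = {a: [[1.0 / n] * n for _ in range(n)] for a in range(len(ACTIONS))}
# Placeholder observation model: higher trust makes reliance more likely.
O = {a: [[0.7, 0.3], [0.6, 0.4], [0.9, 0.1], [0.8, 0.2]]
     for a in range(len(ACTIONS))}

belief = [0.25] * n                          # start from a uniform belief
belief = update_belief(belief, T, O, a=1, o=0)  # saw "rely" under high transparency
```

After observing reliance, the belief shifts toward the high-trust states; a controller in the spirit of the paper would then pick the transparency level for the next intersection based on this posterior, e.g. lowering transparency once trust and workload estimates are in an acceptable range.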

Cite


APA

Akash, K., Jain, N., & Misu, T. (2020). Toward Adaptive Trust Calibration for Level 2 Driving Automation. In ICMI 2020 - Proceedings of the 2020 International Conference on Multimodal Interaction (pp. 538–547). Association for Computing Machinery, Inc. https://doi.org/10.1145/3382507.3418885
