The rise of artificial intelligence capabilities in autonomy-enabled systems and robotics has pushed research to address the unique nature of human-autonomy team collaboration. The goal of these advanced technologies is to enable rapid decision-making, enhance situation awareness, promote shared understanding, and improve team dynamics. Simultaneously, the use of these technologies is expected to reduce risk to those who collaborate with them. Yet for effective human-autonomy teaming to take place, especially as we move beyond dyadic partnerships, team trust must be properly calibrated so that interactions during high-risk operations can be coordinated effectively. Meeting this end requires critical measures of team trust suited to this new dynamic of human-autonomy teams. This article expands on trust measurement principles and the foundations of human-autonomy teaming to propose a "toolkit" of novel methods that support the development, maintenance, and calibration of trust in human-autonomy teams operating within uncertain, risky, and dynamic environments.
Citation
Krausman, A., Neubauer, C., Forster, D., Lakhmani, S., Baker, A. L., Fitzhugh, S. M., … Schaefer, K. E. (2022). Trust Measurement in Human-Autonomy Teams: Development of a Conceptual Toolkit. ACM Transactions on Human-Robot Interaction, 11(3). https://doi.org/10.1145/3530874