Because robots are perceived as moral agents, they hold significant persuasive power over humans. It is thus crucial for robots to behave in accordance with human systems of morality and to use effective strategies for human-robot moral communication. In this work, we evaluate two moral communication strategies: a norm-based strategy grounded in deontological ethics, and a role-based strategy grounded in role ethics. We test how effectively each strategy encourages compliance with norms grounded in role expectations. Our results suggest two major findings: (1) reflective exercises may increase the efficacy of role-based moral language, and (2) opportunities for moral practice following robots' use of moral language may facilitate role-centered moral cultivation.
Wen, R., Kim, B., Phillips, E., Zhu, Q., & Williams, T. (2021). Comparing strategies for robot communication of role-grounded moral norms. In ACM/IEEE International Conference on Human-Robot Interaction (pp. 323–327). IEEE Computer Society. https://doi.org/10.1145/3434074.3447185