Abstract
Humans interpret and predict others’ behaviors by ascribing intentions or beliefs to them — in other words, by adopting the intentional stance. As artificial agents increasingly populate our daily environments, the question arises whether (and under which conditions) humans apply this “human model” to understand the behaviors of these new social agents. In a series of three experiments, we therefore tested whether embedding humans in a social interaction with a humanoid robot displaying either human-like or machine-like behavior would modulate their initial tendency to adopt the intentional stance. Results showed that humans are indeed more prone to adopt the intentional stance after interacting with a more socially available, human-like robot, whereas no modulation of the adoption of the intentional stance emerged toward a mechanistic robot. We conclude that short experiences with humanoid robots that presumably induce a “like-me” impression and social bonding increase the likelihood of adopting the intentional stance.
Citation
Marchesi, S., De Tommaso, D., Perez-Osorio, J., & Wykowska, A. (2022). Belief in Sharing the Same Phenomenological Experience Increases the Likelihood of Adopting the Intentional Stance Toward a Humanoid Robot. Technology, Mind, and Behavior, 3(3). https://doi.org/10.1037/tmb0000072