Feminized AIs designed for in-home verbal assistance are often subjected to gendered verbal abuse by their users. I survey a variety of features contributing to this phenomenon—from financial incentives for businesses to build products likely to provoke gendered abuse, to the impact of such behavior on household members—and identify a potential worry for attempts to criticize the phenomenon: while critics may be tempted to argue that engaging in gendered abuse of AI increases the chances that one will direct such abuse toward human beings, the recent history of attempts to connect video game violence to real-world aggression suggests that things may not be so simple. I turn to Confucian discussions of the role of ritualized social interactions both to better understand the roots of the problem and to investigate potential strategies for improvement, given a complex interplay between designers and device users. I argue that designers must grapple with the entrenched sexism in our society, at the expense of “smooth” and “seamless” user interfaces, in order to intentionally disrupt entrenched but harmful patterns of interaction—and that doing so is both consistent with and recommended by Confucian accounts of social rituals.
CITATION
Elder, A. (2022). Siri, Stereotypes, and the Mechanics of Sexism. Feminist Philosophy Quarterly, 8(3/4). https://doi.org/10.5206/fpq/2022.3/4.14294