Most accounts of responsibility focus on a single type of responsibility, moral responsibility, or address one particular aspect of moral responsibility such as agency. This article outlines a broader framework for thinking about responsibility that includes causal responsibility, relational responsibility, and what I call “narrative responsibility” as a form of “hermeneutic responsibility”; it connects these notions of responsibility with different kinds of knowledge, disciplines, and perspectives on the human being, and shows how this framework helps to map and analyse the various ways in which artificial intelligence (AI) challenges human responsibility and sense-making. Mobilizing recent hermeneutic approaches to technology, the article argues that next to, and interwoven with, other types of responsibility such as moral responsibility, we also have narrative and hermeneutic responsibility, both in general and with regard to technology. For example, it is our task as humans to make sense of, with, and, if necessary, against AI. While, from a posthumanist point of view, technologies also contribute to sense-making, humans are the experiencers and bearers of responsibility and always remain in charge when it comes to this hermeneutic responsibility. Facing and working with a world of data, correlations, and probabilities, we are nevertheless condemned to make sense. Moreover, this also has a normative, sometimes even political aspect: acknowledging and embracing our hermeneutic responsibility is important if we do not want our stories to be written elsewhere, by technology.
Coeckelbergh, M. (2023). Narrative responsibility and artificial intelligence: How AI challenges human responsibility and sense-making. AI and Society, 38(6), 2437–2450. https://doi.org/10.1007/s00146-021-01375-x