Observe, inspect, modify: Three conditions for generative AI governance

Abstract

In a world increasingly shaped by generative AI systems like ChatGPT, the absence of benchmarks to examine the efficacy of oversight mechanisms is a problem for research and policy. What are the structural conditions for governing generative AI systems? To answer this question, it is crucial to situate generative AI systems as regulatory objects: material items that can be governed. On this conceptual basis, we introduce three high-level conditions to structure research and policy agendas on generative AI governance: industrial observability, public inspectability, and technical modifiability. Empirically, we explicate those conditions with a focus on the EU’s AI Act, grounding the analysis of oversight mechanisms for generative AI systems in their granular material properties as observable, inspectable, and modifiable objects. Those three conditions represent an action plan to help us perceive generative AI systems as negotiable objects, rather than seeing them as mysterious forces that pose existential risks for humanity.

Citation (APA)

Ferrari, F., van Dijck, J., & van den Bosch, A. (2023). Observe, inspect, modify: Three conditions for generative AI governance. New Media and Society. https://doi.org/10.1177/14614448231214811
