The illicit diffusion of intimate photographs or videos intended for private use is a troubling phenomenon known as the diffusion of Non-Consensual Intimate Images (NCII). Recently, it has been feared that the spread of deepfake technology, which allows users to fabricate fake intimate images or videos that are indistinguishable from genuine ones, may dramatically extend the scope of NCII. In the present essay, we counter this pessimistic view, arguing for qualified optimism instead. We hypothesize that the growing diffusion of deepfakes will end up disrupting the status that makes our visual experience of photographic images and videos epistemically and affectively special; and that, once divested of this status, NCII will lose much of their allure in the eyes of perpetrators, probably resulting in diminished diffusion. We conclude by offering some caveats and drawing some implications to better understand, and ultimately better counter, this phenomenon.
Viola, M., & Voto, C. (2023). Designed to abuse? Deepfakes and the non-consensual diffusion of intimate images. Synthese, 201(1). https://doi.org/10.1007/s11229-022-04012-2