Explanations as governance? Investigating practices of explanation in algorithmic system design


Abstract

The algorithms underpinning many everyday communication processes are now complex enough that rendering them explainable has become a key governance objective. This article examines the question of 'who should be required to explain what, to whom, in platform environments'. By working with algorithm designers and using design methods to extrapolate existing capacities to explain algorithmic functioning, the article discusses the power relationships underpinning explanation of algorithmic function. Reviewing how the key concepts of transparency and accountability connect with explainability, the paper argues that reliance on explainability as a governance mechanism can generate a dangerous paradox: it legitimates increased reliance on programmable infrastructure, as expert stakeholders are reassured by their ability to perform or receive explanations, while displacing responsibility for understandings of social context and definitions of the public interest.

Citation (APA)

Powell, A. B. (2021). Explanations as governance? Investigating practices of explanation in algorithmic system design. European Journal of Communication, 36(4), 362–375. https://doi.org/10.1177/02673231211028376
