Automated Platform Governance Through Visibility and Scale: On the Transformational Power of AutoModerator

17 citations · 28 Mendeley readers

Abstract

When platforms use algorithms to moderate content, how should researchers understand the impact on moderators and users? Much of the existing literature on this question views moderation as a series of decision-making tasks and evaluates moderation algorithms based on their accuracy. Drawing on literature from the field of platform governance, I argue that content moderation is not merely a series of discrete decisions but rather a complex system of rules, mechanisms, and procedures. Research must therefore articulate how automated moderation alters the broader regime of governance on a platform. To demonstrate this, I report on the findings of a qualitative study of the Reddit bot AutoModerator, using interviews and trace ethnography. I find that the scale of the bot allows moderators to carefully manage the visibility of content and content moderation on Reddit, fundamentally transforming the basic rules of governance on the platform.

Citation (APA)

Wright, L. (2022). Automated Platform Governance Through Visibility and Scale: On the Transformational Power of AutoModerator. Social Media and Society, 8(1). https://doi.org/10.1177/20563051221077020
