Competent Third Parties and Content Moderation on Platforms: Potentials of Independent Decision-Making Bodies from a Governance Structure Perspective

9 citations of this article
24 Mendeley readers who have this article in their library

Abstract

After many years of much-criticized opacity in the field of content moderation, social media platforms are now opening up to a dialogue with users and policymakers. Until now, liability frameworks in the United States and in the European Union (EU) have set incentives for platforms not to monitor user-generated content, an increasingly contested model that has led to (inter alia) practices and policies of noncontainment. Following discussions on platform power over online speech and on how contentious content benefits the attention economy, there is an observable shift toward stricter content moderation duties alongside greater responsibility for content. Nevertheless, much remains unresolved: the legitimacy of platforms' content moderation rules and decisions is still questioned. The platforms' power over the vast majority of communication in the digital sphere remains difficult to grasp because it is private in nature yet often perceived as public. To address this issue, we use a governance structure perspective to identify potential regulatory advantages of establishing cross-platform external bodies for content moderation, ultimately aiming to provide insights into the opportunities and limitations of such a model.

Cite (APA)

Heldt, A., & Dreyer, S. (2021). Competent third parties and content moderation on platforms: Potentials of independent decision-making bodies from a governance structure perspective. Journal of Information Policy. Penn State University Press. https://doi.org/10.5325/jinfopoli.11.2021.0266
