Behind the Screen: Content Moderation in the Shadows of Social Media

Abstract

An eye-opening look at the invisible workers who protect us from seeing humanity’s worst on today’s commercial internet. Social media on the internet can be a nightmarish place. A primary shield against hateful language, violent videos, and online cruelty uploaded by users is not an algorithm. It is people. Mostly invisible by design, more than 100,000 commercial content moderators evaluate posts on mainstream social media platforms: enforcing internal policies, training artificial intelligence systems, and actively screening and removing offensive material—sometimes thousands of items per day. Sarah T. Roberts, an award-winning social media scholar, offers the first extensive ethnographic study of the commercial content moderation industry. Based on interviews with workers from Silicon Valley to the Philippines, at boutique firms and at major social media companies, she contextualizes this hidden industry and examines the emotional toll it takes on its workers. This revealing investigation of the people “behind the screen” offers insights into not only the reality of our commercial internet but the future of globalized labor in the digital age.

Citation (APA)

Roberts, S. T. (2019). Behind the Screen: Content Moderation in the Shadows of Social Media. Yale University Press.
