Professional content moderators are responsible for limiting the negative effects of online discussions on news platforms and social media. However, little is known about how they adjust to platform and company moderation strategies while viewing and dealing with uncivil comments. Using qualitative interviews (N = 18), this study examines which types of comments professional moderators classify as actionable, which (automated) strategies they use to moderate them, and how these perceptions and strategies differ between organizations, platforms, and individuals. Our results show that moderators divide content requiring intervention into clearly problematic and “gray area” comments. They (automatically) delete clear cases but use interactive or motivational moderation techniques for “gray areas.” While moderators crave more advanced technologies, they deem them incapable of addressing context-heavy comments. These findings highlight the need for nuanced regulations, emphasize the crucial role of moderators in shaping public discourse, and offer practical implications for (semi-)automated content moderation strategies.
Citation
Stockinger, A., Schäfer, S., & Lecheler, S. (2023). Navigating the gray areas of content moderation: Professional moderators' perspectives on uncivil user comments and the role of (AI-based) technological tools. New Media & Society. https://doi.org/10.1177/14614448231190901