Auditing Algorithmic Bias on Twitter

33 citations · 44 Mendeley readers

Abstract

Digital media platforms are reshaping our habits, how we access information, and how we interact with others. As a result, the algorithms platforms use, for example, to recommend content, play an increasingly important role in our access to information. Due to the practical difficulty of observing how platforms present content to their users, relatively little is known about how recommendation algorithms affect the information people receive. In this paper, we implement a sock-puppet audit, a computational framework for auditing black-box social media systems, to quantify the impact of algorithmic curation on the information people see. We evaluate this framework by conducting a study on Twitter. We demonstrate that Twitter's timeline curation algorithms skew the popularity and novelty of content people see and increase the inequality of their exposure to friends' tweets. Our work provides evidence that algorithmic curation of content systematically distorts the information people see.
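The abstract reports that algorithmic curation increases the inequality of exposure to friends' tweets. One common way to quantify such inequality is the Gini coefficient over per-friend exposure counts. The sketch below is an illustration of that general idea, not the paper's actual implementation; the friend counts and the comparison of a chronological versus an algorithmic timeline are hypothetical:

```python
def gini(counts):
    """Gini coefficient of non-negative exposure counts.

    0.0 means every friend is seen equally often; values near 1.0
    mean exposure is concentrated on a few friends.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Closed-form Gini over sorted values:
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n


# Hypothetical per-friend impression counts for one sock-puppet account.
chronological = [5, 4, 6, 5, 5]   # exposure spread fairly evenly
algorithmic = [15, 1, 0, 8, 1]    # exposure concentrated on a few friends

print(gini(chronological) < gini(algorithmic))  # True
```

A higher Gini value for the curated timeline, as in this toy comparison, would indicate the kind of exposure skew the paper measures.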

Citation (APA)

Bartley, N., Abeliuk, A., Ferrara, E., & Lerman, K. (2021). Auditing Algorithmic Bias on Twitter. In ACM International Conference Proceeding Series (pp. 65–73). Association for Computing Machinery. https://doi.org/10.1145/3447535.3462491
