Book Review: Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Umoja Noble

  • Iverson, M.

Abstract

Safiya Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism argues that current approaches to evaluating, studying, and contextualizing the social effects of algorithms do little to address how they promote the expansion and reproduction of existing racist and sexist modes of power. Noble uses black feminist theory and critical race studies to orient information sciences and algorithmic studies toward an agenda that centers how algorithms affect the ability of marginalized communities to fight race- and gender-based oppression. In doing so, Noble’s work lifts up the voices and experiences of communities who are systematically excluded or misrepresented in how search engines manage society’s access to, and engagement with, information. Noble’s study focuses mainly on Google’s monopoly over our information infrastructure and economy. While she also addresses corporate users of search such as Yelp, and public users such as the Library of Congress, her singular focus on Google gives readers a clear picture of the risks of granting corporations and institutions the power to manage how we search for and find information on and off the Internet.

Noble’s commitment to changing our relationship with the companies and institutions that provide access to the world’s information is a call for scholars, the tech industry, policy experts, lawmakers, and the general public to trouble the connection between commercial interests and the algorithms that help us search for information that contextualizes our social experiences. In presenting algorithms and search as integral to how we acquire knowledge, Noble aims to make readers more conscientious about how capital influences people’s access to reliable, relevant, and responsible sources of information. Although her primary aim is to demonstrate how commercial search engines reinforce systematic oppression against black Americans (what she calls “technological redlining” [p. 1]), she often expands her framework to include other marginalized communities, such as non–black American women, other communities of color, and people who practice religions other than Christianity. By doing so, she demonstrates the wide-reaching costs of erasing the social power of algorithms for communities who rarely have a say in how they are categorized and indexed through search.

Noble begins her work by challenging the belief that algorithms are neutral and unbiased managers of information. For her, seeing algorithms in this manner obscures the programmers responsible for writing an algorithm’s code, as well as the commercial interests using algorithms to their advantage. Noble is not the first scholar to write about the biased nature of search engines (Introna and Nissenbaum 2000), nor is she the first author to use Google as a primary case study for the dangers that information monopolies pose to the free and equal exchange of information (Vaidhyanathan 2011). However, her perspective does more than bring awareness to how Google’s control manipulates our information economy: she also draws attention to how search results indicate how unequally we come to value, care about, and know different groups of people. Specifically, Noble disputes Google’s assertion that racist and sexist search results stem from glitches in the algorithm’s code. Instead, she contends that “glitches” in Google searches for terms such as “black girls,” “professor,” or “criminal” provide evidence that Google’s code is embedded within the same human epistemologies of race and gender that devalue the experiences of communities not represented in the company’s mainly white and male workforce. While Google invests in projects dedicated to increasing the number of minorities in tech, Noble resists such measures as a panacea for how programmers unconsciously encode bias into search. She sees investing in such future-oriented projects as indicative of Google’s failure to address its present hiring practices, which bypass pools of qualified programmers from marginalized communities. Moreover, because the mostly white and male programmers at Google are not expected to learn to create algorithms that consider the importance of representation for marginalized communities, the company passes the responsibility of addressing bias in algorithms on to future generations of marginalized programmers, while asking little or nothing of its current workforce.

Citation (APA)

Iverson, M. (2020). Book Review: Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Umoja Noble. Television & New Media, 21(5), 547–551. https://doi.org/10.1177/1527476419847320
