Biased Face Recognition Technology Used by Government: A Problem for Liberal Democracy

29 citations · 97 Mendeley readers

Abstract

This paper presents a novel philosophical analysis of the problem of law enforcement’s use of biased face recognition technology (FRT) in liberal democracies. FRT programs used by law enforcement in identifying crime suspects are substantially more error-prone on facial images depicting darker skin tones and females as compared to facial images depicting Caucasian males. This bias can lead to citizens being wrongfully investigated by police along racial and gender lines. The author develops and defends “A Liberal Argument Against Biased FRT,” which concludes that law enforcement use of biased FRT is inconsistent with the classical liberal requirement that government treat all citizens equally before the law. Two objections to this argument are considered and shown to be unsound. The author concludes by suggesting that equality before the law should be preserved while the problem of machine bias ought to be resolved before FRT and other types of artificial intelligence (AI) are deployed by governments in liberal democracies.

Citation (APA)

Gentzel, M. (2021). Biased Face Recognition Technology Used by Government: A Problem for Liberal Democracy. Philosophy and Technology, 34(4), 1639–1663. https://doi.org/10.1007/s13347-021-00478-z
