Fitting the description: historical and sociotechnical elements of facial recognition and anti-black surveillance

Abstract

It is increasingly evident that if researchers and policymakers want to meaningfully develop an understanding of responsible innovation, we must first ask whether some sociotechnical systems should be developed at all. Here I argue that systems like facial recognition, predictive policing, and biometrics are predicated on myriad human prejudicial biases and assumptions which must be named and interrogated prior to any innovation. Further, the notions of individual responsibility inherent in discussions of technological ethics and fairness overburden marginalized peoples with a demand to prove the reality of their marginalization. Instead, we should focus on equity and justice, valuing the experiential knowledge of marginalized peoples and optimally positioning them to enact deep, lasting change. My position aligns with those in Science, Technology, and Society (STS) which center diverse and situated knowledges, and is articulated alongside calls for science and engineering to consider wider sociocultural concerns like justice and equality.

Citation (APA)
Williams, D. P. (2020). Fitting the description: historical and sociotechnical elements of facial recognition and anti-black surveillance. Journal of Responsible Innovation, 7(S1), 74–83. https://doi.org/10.1080/23299460.2020.1831365
