Affirmative Algorithms: Relational Equality as Algorithmic Fairness


Abstract

Many statistical fairness notions have been proposed for algorithmic decision-making systems, and especially for public safety pretrial risk assessment (PSPRA) algorithms such as COMPAS. Most fairness notions equalize something between groups, whether it is false positive rates or accuracy. In fact, I demonstrate that most prominent notions have their basis in equalizing some form of accuracy. However, statistical fairness metrics often do not capture the substantive point of equality. I argue that equal accuracy is not only difficult to measure but also unsatisfactory for ensuring equal justice. In response, I introduce philosopher Elizabeth Anderson's theory of relational equality as a fruitful alternative framework: to relate as equals, people need access to certain basic capabilities. I show that relational equality requires Affirmative PSPRA algorithms that lower risk scores for Black defendants. This is because fairness based on relational equality means considering the impact of PSPRA algorithms' decisions on access to basic capabilities. This impact is racially asymmetric in an unjust society. I make three main contributions: (1) I illustrate the shortcomings of statistical fairness notions in their reliance on equalizing some form of accuracy; (2) I present the first comprehensive ethical defense of Affirmative PSPRA algorithms, based on fairness in terms of relational equality instead; and (3) I show that equalizing accuracy is neither sufficient nor necessary for fairness based on relational equality. Overall, this work serves narrowly as a reason to re-evaluate algorithmic fairness for PSPRA algorithms, and serves broadly as an example of how discussions of algorithmic fairness can benefit from egalitarian philosophical frameworks.
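
The abstract refers to statistical fairness notions that equalize quantities such as false positive rates or accuracy across groups. As a rough illustration of what those group-wise quantities are (this sketch is not from the paper; the function name `group_metrics` and the toy labels are assumptions made for illustration), here is a minimal Python example:

```python
import numpy as np

def group_metrics(y_true, y_pred, group):
    """Compute accuracy and false positive rate (FPR) per group.

    y_true: binary ground-truth labels (1 = outcome occurred)
    y_pred: binary algorithm decisions (1 = classified high risk)
    group:  group membership label for each individual
    """
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    results = {}
    for g in np.unique(group):
        mask = group == g
        yt, yp = y_true[mask], y_pred[mask]
        accuracy = np.mean(yt == yp)
        negatives = yt == 0
        # FPR: share of actual negatives wrongly flagged as high risk
        fpr = np.mean(yp[negatives]) if negatives.any() else float("nan")
        results[g] = {"accuracy": float(accuracy), "fpr": float(fpr)}
    return results

# Toy, made-up labels (not real COMPAS data), purely to show the computation
y_true = [0, 0, 1, 1, 0, 1, 0, 0]
y_pred = [0, 1, 1, 0, 1, 1, 0, 1]
group  = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(group_metrics(y_true, y_pred, group))
```

A notion such as "equal false positive rates" would then demand that the `fpr` values be (approximately) equal across groups; the paper's argument is precisely that equalizing such accuracy-based quantities is neither sufficient nor necessary for fairness understood as relational equality.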

Cite

Zhang, M. (2022). Affirmative Algorithms: Relational Equality as Algorithmic Fairness. In ACM International Conference Proceeding Series (pp. 495–507). Association for Computing Machinery. https://doi.org/10.1145/3531146.3533115
