Algorithmic Bias in Recidivism Prediction: A Causal Perspective

ArXiv: 1911.10640
Citations: 11
Readers (Mendeley): 41

Abstract

ProPublica's analysis of recidivism predictions produced by the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software tool has shown that the predictions were racially biased against African American defendants. We analyze the COMPAS data using a causal reformulation of the underlying algorithmic fairness problem. Specifically, we assess whether COMPAS exhibits racial bias against African American defendants using FACT, a recently introduced causality-grounded measure of algorithmic fairness. We use the Neyman-Rubin potential outcomes framework for causal inference from observational data to estimate FACT from COMPAS data. Our analysis offers strong evidence that COMPAS exhibits racial bias against African American defendants. We further show that the FACT estimates from COMPAS data are robust in the presence of unmeasured confounding.
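
As a rough illustration of the potential-outcomes approach described above, the sketch below estimates an average effect on the treated by matching defendants on observed covariates, treating race as the protected attribute and the COMPAS decile score as the outcome. This is a minimal sketch, not the authors' FACT estimator: the file name (compas-scores-two-years.csv) and column names (race, age, priors_count, decile_score) refer to ProPublica's public COMPAS dataset, and the covariate set and 1-nearest-neighbor matching scheme are simplifying assumptions; the paper's actual estimation and sensitivity analysis may differ.

```python
# Minimal sketch (not the authors' code): a matching-based estimate of an average
# causal effect on the treated in the potential outcomes framework, with
# African American defendants as the "treated" group and the COMPAS decile
# score as the outcome. Column names assume ProPublica's COMPAS release.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("compas-scores-two-years.csv")
df = df[df["race"].isin(["African-American", "Caucasian"])].copy()

covariates = ["age", "priors_count"]  # assumed covariate set, for illustration only
treated = df[df["race"] == "African-American"]
control = df[df["race"] == "Caucasian"]

# Match each African American defendant to the nearest Caucasian defendant on
# the covariates (1-nearest-neighbor matching with replacement; a simplification).
nn = NearestNeighbors(n_neighbors=1).fit(control[covariates].values)
_, idx = nn.kneighbors(treated[covariates].values)
matched_control = control.iloc[idx.ravel()]

# Effect on the treated: mean difference in COMPAS decile scores across matched
# pairs. A positive value indicates higher predicted risk for African American
# defendants relative to comparable Caucasian defendants.
att = (treated["decile_score"].values - matched_control["decile_score"].values).mean()
print(f"Estimated effect on the treated (decile-score gap): {att:.3f}")
```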

Citation (APA)

Khademi, A., & Honavar, V. (2020). Algorithmic Bias in Recidivism Prediction: A Causal Perspective. In Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020) (pp. 13839–13840). AAAI Press.
