Abstract
There is great interest in whether machine learning algorithms deployed in consequential domains (e.g. in criminal justice) treat different demographic groups "fairly." However, there are several proposed notions of fairness, and they are typically mutually incompatible. Using criminal justice as an example, we study a model in which society chooses an incarceration rule. Agents of different demographic groups differ in their outside options (e.g. opportunity for legal employment) and decide whether to commit crimes. We show that equalizing type I and type II errors across groups is consistent with the goal of minimizing the overall crime rate; other popular notions of fairness are not.
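To make the fairness notion in the abstract concrete, the sketch below (not from the paper) computes group-wise type I and type II error rates for a binary decision rule on synthetic data. The data, variable names, and decision rule are all hypothetical illustrations; the paper's own model additionally treats crime decisions as endogenous, which this snippet does not capture.

```python
# Minimal sketch, assuming a binary decision rule and synthetic data:
# compare type I / type II error rates across two demographic groups.
import numpy as np

def error_rates(y_true, y_pred, group, g):
    """Type I (false positive) and type II (false negative) rates for group g."""
    mask = group == g
    yt, yp = y_true[mask], y_pred[mask]
    # Type I error: deciding "incarcerate" (1) when the true label is 0.
    type1 = np.mean(yp[yt == 0]) if np.any(yt == 0) else np.nan
    # Type II error: deciding "release" (0) when the true label is 1.
    type2 = np.mean(1 - yp[yt == 1]) if np.any(yt == 1) else np.nan
    return type1, type2

# Hypothetical data: two demographic groups "A" and "B".
rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=1000)
y_true = rng.integers(0, 2, size=1000)   # 1 = committed a crime
y_pred = rng.integers(0, 2, size=1000)   # 1 = rule says incarcerate

for g in ["A", "B"]:
    t1, t2 = error_rates(y_true, y_pred, group, g)
    print(f"group {g}: type I = {t1:.3f}, type II = {t2:.3f}")
```

Equalizing type I and type II errors across groups, in the sense the abstract refers to, means the two printed rates should (approximately) match between groups A and B.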
Citation
Jung, C., Kannan, S., Lee, C., Pai, M., Roth, A., & Vohra, R. (2020). Fair Prediction with Endogenous Behavior. In EC 2020 - Proceedings of the 21st ACM Conference on Economics and Computation (pp. 677–678). Association for Computing Machinery. https://doi.org/10.1145/3391403.3399473