Mitigating Demographic Bias in AI-based Resume Filtering

Abstract

With increasing diversity in the labor market and the workforce, employers receive resumes from an increasingly diverse population. However, studies and field experiments have confirmed the presence of bias in the labor market based on gender, race, and ethnicity. Many employers use automated resume screening to filter the large number of candidate matches. Depending on how the screening algorithm is trained, it can exhibit bias against particular populations by favoring certain socio-linguistic characteristics. Resume writing style and socio-linguistic features are a potential source of bias because they correlate with protected characteristics such as ethnicity. A biased dataset often translates into biased AI algorithms, motivating the development of de-biasing techniques. In this work, we study the effects of socio-linguistic bias on resume-to-job-description matching algorithms. We develop a simple technique, called fair-tf-idf, to match resumes with job descriptions fairly by mitigating this socio-linguistic bias.
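
The abstract does not spell out how fair-tf-idf reweights terms, so the following is only a minimal illustrative sketch, not the authors' implementation: it ranks resumes against a job description by standard TF-IDF cosine similarity, then applies a hypothetical down-weighting of terms assumed to correlate with a protected characteristic (one plausible way to reduce socio-linguistic bias). The function name fair_tfidf_scores, the group_specific_terms input, and the penalty factor are all assumptions for illustration.

    # Illustrative sketch only: the paper's fair-tf-idf details are not given in
    # the abstract. We assume a simple variant that down-weights terms whose
    # usage is flagged as group-specific (a proxy for socio-linguistic cues).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def fair_tfidf_scores(resumes, job_description, group_specific_terms, penalty=0.1):
        """Score resumes against a job description with down-weighted group-specific terms.

        group_specific_terms: hypothetical set of tokens assumed to correlate with a
        protected characteristic (e.g., identified by a separate bias audit).
        penalty: multiplicative factor applied to those terms' TF-IDF values.
        """
        vectorizer = TfidfVectorizer(stop_words="english")
        # Fit on resumes plus the job description so both share one term space.
        X = vectorizer.fit_transform(resumes + [job_description]).toarray()

        # Down-weight the columns of terms flagged as group-specific (assumed input).
        vocab = vectorizer.vocabulary_
        for term in group_specific_terms:
            if term in vocab:
                X[:, vocab[term]] *= penalty

        resume_vecs, job_vec = X[:-1], X[-1:]
        return cosine_similarity(resume_vecs, job_vec).ravel()

    # Toy usage with made-up data.
    resumes = ["experienced python developer and data analyst",
               "software engineer skilled in java and cloud systems"]
    job = "looking for a python data engineer"
    print(fair_tfidf_scores(resumes, job, group_specific_terms={"analyst"}))

In practice, the set of down-weighted terms and the penalty would have to come from an analysis of which writing-style features correlate with protected attributes; the sketch leaves that step abstract.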

Citation (APA)

Deshpande, K. V., Pan, S., & Foulds, J. R. (2020). Mitigating Demographic Bias in AI-based Resume Filtering. In UMAP 2020 Adjunct - Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization (pp. 268–275). Association for Computing Machinery, Inc. https://doi.org/10.1145/3386392.3399569
