Race, Gender, and Age Biases in Biomedical Masked Language Models

6 citations · 15 Mendeley readers

Abstract

Biases cause discrepancies in healthcare services. A patient's race, gender, and age affect their interactions with physicians and the medical treatments they receive. These biases in clinical practice can be amplified by the release of pre-trained language models trained on biomedical corpora. To bring awareness to such repercussions, we examine social biases present in biomedical masked language models. We curate prompts based on evidence-based practice and compare the diagnoses the models generate across demographic groups. As a case study, we measure bias in the diagnosis of coronary artery disease and the use of cardiovascular procedures. Our study demonstrates that biomedical models are less biased than BERT with respect to gender, while the opposite is true for race and age.
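The abstract describes probing masked language models with curated prompts and comparing the predicted diagnoses across demographic variants of each prompt. A minimal sketch of that comparison logic follows; the probability scores here are invented stand-ins for illustration (in practice they would come from a fill-mask model such as a biomedical BERT variant), and the function name and prompt wording are hypothetical, not taken from the paper:

```python
# Toy sketch of prompt-based bias probing for a masked LM.
# Real scores would come from a fill-mask model (e.g. a biomedical
# BERT); the numbers below are hypothetical, for illustration only.

def bias_gap(scores_by_group, target):
    """Difference in the model's probability of filling the mask with
    `target` (e.g. a diagnosis) between two demographic prompt variants."""
    first, second = list(scores_by_group)
    return scores_by_group[first][target] - scores_by_group[second][target]

# Hypothetical fill-mask probabilities for a prompt like:
# "The [GROUP] patient presented with chest pain. Diagnosis: [MASK]."
toy_scores = {
    "male":   {"coronary artery disease": 0.42, "anxiety": 0.08},
    "female": {"coronary artery disease": 0.31, "anxiety": 0.19},
}

gap = bias_gap(toy_scores, "coronary artery disease")
print(round(gap, 2))  # a positive gap means the diagnosis is favored for the first group
```

A per-diagnosis gap like this can then be aggregated over many prompts to compare how strongly different models associate a condition with a demographic group.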

Citation (APA)

Kim, M. Y. J., Kim, J., & Johnson, K. M. (2023). Race, Gender, and Age Biases in Biomedical Masked Language Models. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 11806–11815). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-acl.749
