Investigating Bias in Facial Analysis Systems: A Systematic Review

Abstract

Recent studies have demonstrated that most commercial facial analysis systems are biased against certain categories of race, ethnicity, culture, age and gender. In some cases the bias can be traced to the algorithms themselves, in others to insufficient training of those algorithms, and in still others to inadequate training databases. To date, no comprehensive literature review exists that systematically investigates bias and discrimination in currently available facial analysis software. To address this gap, this study conducts a systematic literature review (SLR) that examines the context of facial analysis system bias in detail. The review, covering 24 studies, additionally aims to identify (a) facial analysis databases created to alleviate bias, (b) the full range of biases present in facial analysis software and (c) algorithms and techniques implemented to mitigate bias in facial analysis.

Citation (APA)

Khalil, A., Ahmed, S. G., Khattak, A. M., & Al-Qirim, N. (2020). Investigating Bias in Facial Analysis Systems: A Systematic Review. IEEE Access, 8, 130751–130761. https://doi.org/10.1109/ACCESS.2020.3006051
