Making decisions; Bias in artificial intelligence and data-driven diagnostic tools


Abstract

Background: Although numerous studies have shown the potential of artificial intelligence (AI) systems to drastically improve clinical practice, there are concerns that these systems could replicate existing biases.

Objective: This paper provides a brief overview of ‘algorithmic bias’, which refers to the tendency of some AI systems to perform poorly for disadvantaged or marginalised groups.

Discussion: AI relies on data generated, collected, recorded and labelled by humans. If AI systems remain unchecked, whatever biases exist in the real world and are embedded in the data will be incorporated into AI algorithms. Algorithmic bias can be considered an extension, if not a new manifestation, of existing social biases, understood as negative attitudes towards, or discriminatory treatment of, some groups. In medicine, algorithmic bias can compromise patient safety and risks perpetuating disparities in care and outcomes. Clinicians should therefore consider the risk of bias when deploying AI-enabled tools in their practice.



Citation (APA)
Aquino, Y. S. J. (2023). Making decisions; Bias in artificial intelligence and data-driven diagnostic tools. Australian Journal of General Practice, 52(7), 439–442. https://doi.org/10.31128/AJGP-12-22-6630

