How to deal with an AI near-miss: Look to the skies


Abstract

AI systems are harming people. Harms such as discrimination and manipulation are reported in the media, which is currently the primary source of information on AI incidents. Reporting AI near misses and learning how serious incidents were prevented would help avoid future incidents. The problem is that ongoing efforts to catalog AI incidents rely on media reports, which does nothing to prevent incidents. Developers, designers, and deployers of AI systems should be incentivized to report and share information on near misses. Such an AI near-miss reporting system does not have to be designed from scratch; the aviation industry's voluntary, confidential, and non-punitive approach to such reporting can serve as a guide. AI incidents are accumulating, and the sooner such a near-miss reporting system is established, the better.

Citation (APA)

Shrishak, K. (2023). How to deal with an AI near-miss: Look to the skies. Bulletin of the Atomic Scientists, 79(3), 166–169. https://doi.org/10.1080/00963402.2023.2199580
