Culture and the safety of complex automated sociotechnical systems

Abstract

Sociotechnical systems are becoming more complex and increasingly automated. Although human error is now widely viewed as playing a key role in the majority of system failures, there is increasing recognition of the oversimplification inherent in such a view. This paper examines mismatches between the procedures and automation technologies of sociotechnical systems and their operators from the viewpoint of human culture and capabilities, with a particular focus on flight deck automation. Following an introduction to culture, its sources, its measurement, and its effects, the paper describes recent theories of thinking and decision making and the influence of culture on decisions. Problems associated with automation are then presented, and it is concluded that current automation systems perform as very inadequate team members, leaving the human operators or crew unprepared when failures occur or unusual events arise. © 2013 IEEE.

Citation (APA)

Hodgson, A., Siemieniuch, C. E., & Hubbard, E. M. (2013). Culture and the safety of complex automated sociotechnical systems. IEEE Transactions on Human-Machine Systems, 43(6), 608–619. https://doi.org/10.1109/THMS.2013.2285048
