Image-guided military operations embed soldiers into a complex system of image production, transmission, and perception. Images separate soldiers' bodies from the battlefield, but they also mediate between them. In particular, remotely controlled operations of so-called unmanned aerial systems (UAS) require the real-time synchronization of human actors and technical sensors, as well as knowledge of the situation. This situational awareness relies almost exclusively on the visualization of sensor data. This human-machine entanglement corresponds to a new operative modality of images that differs from previous forms of real-time imaging, such as live broadcasting, because it is based on a feedback loop that turns the observer into an actor. Images are not simply analyzed and interpreted but become agents in a socio-technological assemblage. The paper will examine this functional shift of images from a medium of visualization toward a medium that guides operative processes. Based on an analysis of vision, architecture, and navigation in remote warfare, it will discuss how real-time video technology and the mobilization of sensor and transmission technology produce a type of intervention in which action and perception are increasingly organized and determined by machines.
Queisner, M. (2017). ‘Looking Through a Soda Straw’: Mediated Vision in Remote Warfare. Politik, 20(1). https://doi.org/10.7146/politik.v20i1.27644