Deciding on Appropriate Use of Force: Human-machine Interaction in Weapons Systems and Emerging Norms

Abstract

This article considers the role of norms in the debate on autonomous weapons systems (AWS). It argues that the academic and political discussion is largely dominated by considerations of how AWS relate to norms institutionalised in international law. While this debate on AWS has produced insights into legal and ethical norms and sounded out options for a possible regulation or ban, it neglects to investigate how complex human-machine interactions in weapons systems can set standards of appropriate use of force that are politically and normatively relevant but take place outside of formal, deliberative law-setting. While such procedural norms are already emerging in the practice of contemporary warfare, the increasing technological complexity of AI-driven weapons will add to their political-normative relevance. I argue that public deliberation about, and political oversight and accountability of, the use of force is at risk of being consumed and normalised by functional procedures and perceptions. This can have a profound impact on the future of remote warfare and security policy.

Citation (APA)

Huelss, H. (2019). Deciding on Appropriate Use of Force: Human-machine Interaction in Weapons Systems and Emerging Norms. Global Policy, 10(3), 354–358. https://doi.org/10.1111/1758-5899.12692
