Dehumanization: Is there a legal problem under Article 36?

Abstract

While remote attack, whether conducted by remote piloting, autonomous attack technology or cyber techniques, does not per se raise legal issues, it has a clear ethical dimension. People nevertheless remain closely involved, fulfilling various critical roles. Not all forms of machine learning associated with attack technologies are unacceptable; consider, for example, learning methods integrated into a weapon system that are designed to increase or ensure the protection of victims. Ethical concerns will, however, persist, centred on the view that machines should not be permitted to decide who is to be attacked and who is to be spared. Zero-casualty warfare is not, as such, unlawful. Customary and treaty rules of weapons law apply to these weapon technologies, including the obligation on states to undertake weapon reviews. The chapter summarises these customary and treaty rules and notes that reviewing autonomous weapon technologies will involve assessing whether the weapon system is capable of undertaking the decision-making that targeting law, to which the chapter also refers, requires.

Citation

Boothby, W. (2017). Dehumanization: Is there a legal problem under article 36? In Dehumanization of Warfare: Legal Implications of New Weapon Technologies (pp. 21–52). Springer International Publishing. https://doi.org/10.1007/978-3-319-67266-3_3
