While remote attack, whether conducted by remote piloting, autonomous attack technology or cyber techniques, does not per se raise legal issues, it has a clear ethical dimension. People nevertheless remain closely involved, fulfilling various critical roles. Not all forms of machine learning associated with attack technologies are unacceptable. Consider, for example, learning methods integrated into a weapon system that are designed to increase or ensure the protection of potential victims. Ethical concerns will, however, persist, centring on the view that machines should not be permitted to decide who is to be attacked and who is to be spared. Zero-casualty warfare is not as such unlawful. Customary and treaty rules of weapons law apply to these weapon technologies, including the obligation for states to undertake weapon reviews. The chapter summarises these customary and treaty rules and notes that reviewing autonomous weapon technologies will involve assessing whether the weapon system is capable of undertaking the decision-making that targeting law requires, to which the chapter refers.
Boothby, W. (2017). Dehumanization: Is there a legal problem under article 36? In Dehumanization of Warfare: Legal Implications of New Weapon Technologies (pp. 21–52). Springer International Publishing. https://doi.org/10.1007/978-3-319-67266-3_3