This article discusses an important limitation on the degree of autonomy that may permissibly be afforded to autonomous weapon systems (AWS) in the context of an armed conflict: the extent to which international humanitarian law (IHL) requires that human beings be able to intervene directly in the operation of weapon systems in the course of an attack. As there is currently no conventional or customary law directed specifically at AWS, limits on the use of autonomous capabilities in weapons, if any exist, must be inferred from the principles, rules and goals of general IHL. The approach adopted herein is to look for two broad types of limitation: those which take the form of maximum permissible degrees of machine involvement in regulated activities, and those which take the form of minimum permissible degrees of human involvement. The article's main finding is that while existing law does not impose limits of the first type, it does impose some of the second type. Specifically, legal obligations borne by individuals (commanders in charge of AWS operations, weapon system operators and others) determine the required minimum capacity for direct human intervention. The article further suggests means by which the required degree of human intervention may be determined in specific circumstances.
CITATION STYLE
McFarland, T. (2022). Minimum Levels of Human Intervention in Autonomous Attacks. Journal of Conflict and Security Law, 27(3), 387–409. https://doi.org/10.1093/jcsl/krac021