This chapter explores the moral implications of autonomous robotic weapons by answering several key questions. First, in what sense are such weapons really autonomous? It is argued that they are not autonomous in any morally significant sense. Second, do such weapons necessarily compromise the moral responsibility of their human designers, computer programmers, and/or operators and, if so, in what manner and to what extent? It is argued that they do not necessarily do so, at least if such weapons have a human in or on the loop. Finally, should certain forms of autonomous weapons be prohibited? It is argued that human-out-of-the-loop weapons should be prohibited.
CITATION
Miller, S. (2018). Autonomous Weapons: Terminator-Esque Software Design. In Advanced Sciences and Technologies for Security Applications (pp. 157–169). Springer. https://doi.org/10.1007/978-3-319-74107-9_12