The responsibility gap: Ascribing responsibility for the actions of learning automata

Abstract

Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines based on neural networks, genetic algorithms and agent architectures create a new situation, in which the manufacturer/operator is in principle no longer capable of predicting the machine's future behaviour and thus cannot be held morally responsible or liable for it. Society must choose between no longer using this kind of machine (which is not a realistic option) and facing a responsibility gap that cannot be bridged by traditional concepts of responsibility ascription.

Author-supplied keywords

  • Artificial intelligence
  • Autonomous robots
  • Learning machines
  • Liability
  • Moral responsibility

Authors

  • Andreas Matthias
