AGI Control Theory

Abstract

According to forecasts, the invention of Artificial General Intelligence (AGI) will change the trajectory of the development of human civilization. To take advantage of this powerful technology and avoid its pitfalls, it is important to be able to control it. However, the ability to control AGI, and its more advanced version, "Superintelligence", has not been established. In this article, we explore the arguments that advanced AI cannot be fully controlled. The implications of uncontrolled AI are discussed in relation to the future of humanity, AI research, and the safety and security of AI systems.

Citation (APA)

Yampolskiy, R. V. (2022). AGI Control Theory. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13154 LNAI, pp. 316–326). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-93758-4_33
