Towards Autonomous Developmental Artificial Intelligence: Case Study for Explainable AI

Abstract

State-of-the-art autonomous AI algorithms such as reinforcement learning and deep learning suffer from high computational complexity, poor explainability, and a limited capacity for incremental adaptive learning. In response to these challenges, this paper highlights the TMGWR-based algorithm, developed by the present authors, as a case study towards self-adaptive unsupervised learning in autonomous developmental AI, and makes the following contributions: it presents and reviews essential requirements for today's autonomous AI and includes an analysis of their potential for Green AI; and it demonstrates that, unlike these state-of-the-art algorithms, TMGWR possesses explainability potential that can be further developed and exploited for autonomous learning applications. In addition to shaping researchers' choice of metrics for selecting autonomous learning strategies, this paper will help to motivate further innovative research in autonomous AI.
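
For readers unfamiliar with growing self-organising networks, the sketch below illustrates the general "grow when required" (GWR) idea that underlies TMGWR-style algorithms: the network inserts a new node only when an input is poorly represented by an already well-trained winner, so the learned structure grows incrementally with experience. This is a minimal illustrative sketch, not the authors' TMGWR implementation; the class name GrowWhenRequiredSketch, the thresholds, and the learning rates are assumptions chosen for this example, and TMGWR's temporospatial merging is omitted.

```python
# Minimal, illustrative sketch of a GWR-style "grow when required" update.
# NOT the authors' TMGWR implementation: node structure, thresholds
# (activity_threshold, habituation_threshold) and learning rates are
# assumptions chosen only to illustrate incremental, self-organising growth.
import numpy as np


class GrowWhenRequiredSketch:
    def __init__(self, dim, activity_threshold=0.8, habituation_threshold=0.1,
                 eps_winner=0.1, eps_neighbour=0.01):
        rng = np.random.default_rng(0)
        self.weights = [rng.normal(size=dim), rng.normal(size=dim)]  # two seed nodes
        self.habituation = [1.0, 1.0]   # 1.0 = novel node, decays as the node wins
        self.edges = {(0, 1)}           # topological links between nodes
        self.a_T = activity_threshold
        self.h_T = habituation_threshold
        self.eps_b = eps_winner
        self.eps_n = eps_neighbour

    def step(self, x):
        """Process one input vector; grow a node if the input is poorly represented."""
        x = np.asarray(x, dtype=float)
        dists = [np.linalg.norm(x - w) for w in self.weights]
        b, s = np.argsort(dists)[:2]          # best and second-best matching nodes
        activity = np.exp(-dists[b])          # close to 1 when the input is well covered
        if activity < self.a_T and self.habituation[b] < self.h_T:
            # Winner is well trained yet still matches poorly: insert a new node
            # halfway between the winner and the input, and rewire the topology.
            new_index = len(self.weights)
            self.weights.append(0.5 * (self.weights[b] + x))
            self.habituation.append(1.0)
            self.edges |= {(b, new_index), (s, new_index)}
            self.edges.discard((min(b, s), max(b, s)))
        else:
            # Otherwise adapt the winner (and its topological neighbours) towards the input.
            self.weights[b] += self.eps_b * self.habituation[b] * (x - self.weights[b])
            for i, j in self.edges:
                if b in (i, j):
                    n = j if i == b else i
                    self.weights[n] += self.eps_n * self.habituation[n] * (x - self.weights[n])
            self.habituation[b] *= 0.9        # winner habituates (becomes less novel)
        return int(b)


if __name__ == "__main__":
    gwr = GrowWhenRequiredSketch(dim=2)
    for point in np.random.default_rng(1).uniform(-1, 1, size=(200, 2)):
        gwr.step(point)
    print(f"nodes after 200 inputs: {len(gwr.weights)}")
```

Because nodes are added only on demand, the resulting graph of nodes and edges stays compact and can be inspected directly, which is one reason growing self-organising approaches of this kind lend themselves to the explainability argument made in the paper.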

Citation (APA)

Starkey, A., & Ezenkwu, C. P. (2023). Towards Autonomous Developmental Artificial Intelligence: Case Study for Explainable AI. In IFIP Advances in Information and Communication Technology (Vol. 676 IFIP, pp. 94–105). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-34107-6_8
