Argument Schemes and a Dialogue System for Explainable Planning


Abstract

Artificial Intelligence (AI) is being increasingly deployed in practical applications. However, a major concern is whether AI systems will be trusted by humans. To establish trust in AI systems, users need to understand the reasoning behind their solutions. Therefore, systems should be able to explain and justify their output. Explainable AI Planning is a field concerned with explaining the outputs (i.e., the solution plans) produced by AI planning systems to a user. The main goal of a plan explanation is to help humans understand the reasoning behind the plans produced by the planners. In this article, we propose an argument scheme-based approach to providing explanations in the domain of AI planning. We present novel argument schemes for creating arguments that explain a plan and its key elements, together with a set of critical questions that allow interaction between the arguments and enable the user to obtain further information about the key elements of the plan. Furthermore, we present a novel dialogue system that uses the argument schemes and critical questions to provide interactive dialectical explanations.

Citation (APA)
Mahesar, Q. A., & Parsons, S. (2023). Argument Schemes and a Dialogue System for Explainable Planning. ACM Transactions on Intelligent Systems and Technology, 14(5). https://doi.org/10.1145/3610301
