No pizza for you: Value-based plan selection in BDI agents

58 citations · 64 readers (Mendeley)
Abstract

Autonomous agents are increasingly required to be able to make moral decisions. In these situations, the agent should be able to reason about the ethical bases of the decision and explain its decision in terms of the moral values involved. This is of special importance when the agent is interacting with a user and should understand the value priorities of the user in order to provide adequate support. This paper presents a model of agent behavior that takes into account user preferences and moral values.
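To make the idea of value-based plan selection concrete, the following is a minimal, hypothetical sketch (in Python) of how an agent might rank candidate plans against a user's value priorities. The plan annotations, value names, and weighted scoring scheme are illustrative assumptions, not the formalism used in the paper.

```python
# Illustrative sketch only: a toy value-based plan selection step.
# Plan annotations, value names, and the scoring scheme are assumptions
# made for illustration; they are not the paper's actual mechanism.

from dataclasses import dataclass, field


@dataclass
class Plan:
    name: str
    # Values promoted (+1) or demoted (-1) by executing this plan.
    value_effects: dict = field(default_factory=dict)


def select_plan(plans, value_priorities):
    """Pick the plan that best matches the user's value priorities.

    value_priorities maps a value name to a weight; a higher weight
    means the user cares more about promoting that value.
    """
    def score(plan):
        return sum(value_priorities.get(v, 0) * effect
                   for v, effect in plan.value_effects.items())
    return max(plans, key=score)


if __name__ == "__main__":
    # A user who prioritises health over convenience: the agent declines
    # to order pizza and proposes a salad instead ("no pizza for you").
    plans = [
        Plan("order_pizza", {"convenience": +1, "health": -1}),
        Plan("prepare_salad", {"convenience": -1, "health": +1}),
    ]
    user_values = {"health": 2.0, "convenience": 1.0}
    print(select_plan(plans, user_values).name)  # prepare_salad
```

In this toy version the selected plan can also be explained in terms of the values it promotes or demotes, which mirrors the abstract's requirement that the agent justify its decision by reference to the moral values involved.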

Citation (APA)

Cranefield, S., Winikoff, M., Dignum, V., & Dignum, F. (2017). No pizza for you: Value-based plan selection in BDI agents. In IJCAI International Joint Conference on Artificial Intelligence (pp. 178–184). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/26
