Influence of state-variable constraints on partially observable Monte Carlo planning

Abstract

Online planning methods for partially observable Markov decision processes (POMDPs) have recently gained much interest. In this paper, we propose introducing prior knowledge in the form of (probabilistic) relationships among discrete state variables into online planning based on the well-known POMCP algorithm. In particular, we propose the use of hard constraint networks and probabilistic Markov random fields to formalize state-variable constraints, and we extend the POMCP algorithm to take advantage of these constraints. Results on a case study based on Rocksample show that the use of this knowledge significantly improves the algorithm's performance. The extent of the improvement depends on the amount of knowledge encoded in the constraints and reaches 50% of the average discounted return in the most favorable cases we analyzed.
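To illustrate the idea of exploiting hard state-variable constraints during planning, here is a minimal Python sketch of rejection-sampling belief particles over binary rock values, as in Rocksample, so that only states consistent with the constraints survive. The function names and the example constraint are hypothetical, not taken from the paper; they only convey how prior knowledge can prune the sampled state space.

```python
import random

def satisfies_constraints(state, constraints):
    """Return True iff every hard constraint holds for this assignment
    of the discrete state variables (here, a tuple of booleans)."""
    return all(c(state) for c in constraints)

def sample_constrained_particles(n, num_rocks, constraints, rng):
    """Rejection-sample n belief particles over binary rock values,
    keeping only states that satisfy all hard constraints.
    (Illustrative sketch, not the paper's actual POMCP extension.)"""
    particles = []
    while len(particles) < n:
        # Each rock is independently valuable with probability 0.5.
        state = tuple(rng.random() < 0.5 for _ in range(num_rocks))
        if satisfies_constraints(state, constraints):
            particles.append(state)
    return particles

# Hypothetical prior knowledge: rocks 0 and 1 are never both valuable.
constraints = [lambda s: not (s[0] and s[1])]
rng = random.Random(0)
particles = sample_constrained_particles(100, 4, constraints, rng)
# Every surviving particle respects the constraint.
assert all(not (p[0] and p[1]) for p in particles)
```

In the probabilistic (Markov random field) variant described in the abstract, a state would instead be weighted by its likelihood under the field rather than rejected outright; the sketch above covers only the hard-constraint case.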

Citation (APA)

Castellini, A., Chalkiadakis, G., & Farinelli, A. (2019). Influence of state-variable constraints on partially observable Monte Carlo planning. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 5540–5546). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/769
