POMDP based action planning and human error detection

Abstract

This paper presents a Partially Observable Markov Decision Process (POMDP) model for action planning and human error detection during Activities of Daily Living (ADLs). The model is integrated into a sub-component of an assistive system designed for stroke survivors, called the Artificial Intelligent Planning System (AIPS). Its main goal is to monitor the user's history of actions during a specific task and to provide meaningful assistance when an error is detected in his/her sequence of actions. To do so, the AIPS must cope with ambiguity in the outputs of the system's other components. In this paper, we first give an overview of the global assistive system in which the AIPS is implemented and explain how it interacts with the user to guide him/her during tea-making. We then define the POMDP models and the Monte Carlo algorithm used to learn how to retrieve optimal prompts and detect human errors under uncertainty.
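The planning loop the abstract describes — maintaining a belief over the user's true task state from noisy observations, then choosing a prompt by Monte Carlo estimation of expected reward — can be sketched with a toy example. All state names, noise levels, and reward values below are invented for illustration and are not the paper's actual tea-making model.

```python
import random

# Toy task states for a tea-making sequence; "error" means the user deviated.
STATES = ["fill_kettle", "boil_water", "add_teabag", "error"]
ACTIONS = ["wait", "prompt_next", "prompt_correct"]

# Sensors are noisy: each state emits the wrong observation with some
# probability (the 0.2 noise level is an arbitrary illustrative choice).
OBS_NOISE = 0.2

def observation_prob(obs, state):
    """P(observation | state) under a simple symmetric noise model."""
    if obs == state:
        return 1.0 - OBS_NOISE
    return OBS_NOISE / (len(STATES) - 1)

def belief_update(belief, obs):
    """Bayes filter: fold one noisy observation into the belief over states."""
    posterior = {s: belief[s] * observation_prob(obs, s) for s in STATES}
    z = sum(posterior.values())
    return {s: p / z for s, p in posterior.items()}

def reward(state, action):
    """Illustrative rewards: correcting a real error pays off, prompting a
    user who is on track is mildly annoying, ignoring an error is costly."""
    if state == "error":
        return 5.0 if action == "prompt_correct" else -5.0
    return 1.0 if action == "wait" else -1.0

def monte_carlo_value(belief, action, n=1000):
    """Estimate the expected reward of `action` by sampling states
    from the current belief."""
    states, weights = zip(*belief.items())
    total = 0.0
    for _ in range(n):
        s = random.choices(states, weights=weights)[0]
        total += reward(s, action)
    return total / n

def best_action(belief):
    """Pick the prompt with the highest Monte Carlo value estimate."""
    return max(ACTIONS, key=lambda a: monte_carlo_value(belief, a))

# Starting from a uniform belief, two consecutive "error" observations
# concentrate the belief on the error state, so the planner prompts a fix.
belief = {s: 1.0 / len(STATES) for s in STATES}
belief = belief_update(belief, "error")
belief = belief_update(belief, "error")
```

This one-step Monte Carlo evaluation stands in for the full POMDP solution the paper learns, which would also account for state transitions and long-horizon value; the belief-update step, however, is the standard Bayes filter any such model relies on.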

Citation (APA)

Jean-Baptiste, E. M. D., Rotshtein, P., & Russell, M. (2015). POMDP based action planning and human error detection. In IFIP Advances in Information and Communication Technology (Vol. 458, pp. 250–265). Springer New York LLC. https://doi.org/10.1007/978-3-319-23868-5_18
