Hospital Inventory Management through Markov Decision Processes @runtime


Abstract

Stock management in a hospital requires the achievement of a trade-off between conflicting criteria, with mandatory requirements on the quality of patient care as well as on purchasing and logistics costs. We address daily drug ordering in a ward of an Italian public hospital, where patient admission/discharge and drug consumption during the sojourn are subject to uncertainty. To derive optimal control policies minimizing the overall purchasing and stocking cost while avoiding drug shortages, the problem is modeled as a Markov Decision Process (MDP), fitting the statistics of hospitalization time and drug consumption through a discrete phase-type (DPH) distribution or a Hidden Markov Model (HMM). A planning algorithm that operates at run-time iteratively synthesizes and solves the MDP over a finite horizon, applies the first action of the best policy found, and then moves the horizon forward by one day. Experiments show the convenience of the proposed approach with respect to baseline inventory management policies.
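The abstract's planning loop (synthesize a finite-horizon MDP, solve it, apply the first action of the best policy, then shift the horizon forward by one day) can be illustrated with a minimal sketch. Everything here is a hypothetical toy instance, not the paper's model: stock levels and order sizes are small integers, costs are placeholder constants, and a fixed i.i.d. daily demand distribution stands in for the DPH/HMM fit of hospitalization time and drug consumption described above.

```python
import random

# Hypothetical toy parameters (not from the paper): ward stock capacity,
# max daily order, per-unit purchase, holding, and shortage costs.
S_MAX, Q_MAX, H = 10, 5, 7
C_BUY, C_HOLD, C_SHORT = 1.0, 0.2, 10.0
# Assumed i.i.d. daily demand distribution; the paper instead fits a
# discrete phase-type distribution or an HMM to consumption statistics.
DEMAND = {0: 0.2, 1: 0.4, 2: 0.3, 3: 0.1}

def solve_finite_horizon():
    """Backward induction over H days: V[t][s] = min expected cost-to-go."""
    V = [[0.0] * (S_MAX + 1) for _ in range(H + 1)]
    policy = [[0] * (S_MAX + 1) for _ in range(H)]
    for t in range(H - 1, -1, -1):
        for s in range(S_MAX + 1):
            best_a, best_cost = 0, float("inf")
            for a in range(min(Q_MAX, S_MAX - s) + 1):  # feasible orders
                cost = C_BUY * a
                for d, p in DEMAND.items():
                    short = max(d - (s + a), 0)          # unmet demand
                    s_next = max(s + a - d, 0)           # leftover stock
                    cost += p * (C_SHORT * short + C_HOLD * s_next
                                 + V[t + 1][s_next])
                if cost < best_cost:
                    best_a, best_cost = a, cost
            policy[t][s] = best_a
            V[t][s] = best_cost
    return policy

def plan_at_runtime(days, stock=5, seed=0):
    """Receding horizon: re-solve daily, apply only the first action."""
    rng = random.Random(seed)
    demands, probs = zip(*DEMAND.items())
    for _ in range(days):
        order = solve_finite_horizon()[0][stock]  # first action of best policy
        d = rng.choices(demands, probs)[0]        # observed daily consumption
        stock = max(stock + order - d, 0)         # horizon moves forward a day
    return stock
```

In this sketch the MDP is re-solved from scratch each day; in the paper the model itself is re-synthesized at run-time, so the demand statistics can reflect the current patient population rather than a fixed distribution.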

Citation (APA)

Biagi, M., Carnevali, L., Santoni, F., & Vicario, E. (2018). Hospital inventory management through Markov decision processes @runtime. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11024 LNCS, pp. 87–103). Springer Verlag. https://doi.org/10.1007/978-3-319-99154-2_6
