Software is often governed by, and thus adapts to, phenomena that occur at runtime. Unlike traditional decision problems, where a fully determined decision-making model is available for reasoning, the adaptation logic of such software must work with empirical data and is subject to practical constraints. We present an Iterative Decision-Making Scheme (IDMS) that infers both point and interval estimates for the undetermined transition probabilities of a Markov Decision Process (MDP) from sampled data, and iteratively computes a confidently optimal scheduler from a given finite subset of schedulers. The key feature of IDMS is the flexibility to adjust the criterion of confident optimality and the sample size within the iteration, yielding a tradeoff among accuracy, data usage, and computational overhead. We apply IDMS to the existing self-adaptation framework Rainbow and conduct a case study on a Rainbow system to demonstrate this flexibility.
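To make the idea concrete, the following Python sketch illustrates the two ingredients the abstract describes: point and interval estimates for transition probabilities derived from sampled transitions, and a check for a confidently optimal scheduler within a finite set. The normal-approximation (Wald) interval, the interval-dominance criterion, and all names here (estimate_transitions, confidently_optimal, the value callback) are illustrative assumptions for this sketch, not the definitions used in the paper.

```python
import math
from collections import defaultdict

def estimate_transitions(samples, z=1.96):
    """Point and interval estimates for P(s' | s, a) from (s, a, s') samples.

    Uses the maximum-likelihood point estimate and a normal-approximation
    (Wald) confidence interval, clipped to [0, 1]. This interval choice is
    an assumption made for illustration only.
    """
    counts = defaultdict(lambda: defaultdict(int))  # (s, a) -> s' -> count
    totals = defaultdict(int)                       # (s, a) -> total count
    for s, a, s_next in samples:
        counts[(s, a)][s_next] += 1
        totals[(s, a)] += 1

    estimates = {}
    for (s, a), successors in counts.items():
        n = totals[(s, a)]
        for s_next, k in successors.items():
            p = k / n                                   # point estimate
            half_width = z * math.sqrt(p * (1 - p) / n)  # interval half-width
            estimates[(s, a, s_next)] = (
                p,
                max(0.0, p - half_width),
                min(1.0, p + half_width),
            )
    return estimates

def confidently_optimal(schedulers, value):
    """Return a scheduler whose pessimistic (lower-bound) value dominates the
    optimistic (upper-bound) value of every other candidate, or None.

    `value(scheduler, bound)` is assumed to evaluate the scheduler's expected
    reward under "point", "lower", or "upper" transition estimates; it is a
    hypothetical callback, not an API from the paper.
    """
    for cand in schedulers:
        lo = value(cand, "lower")
        if all(lo >= value(other, "upper")
               for other in schedulers if other is not cand):
            return cand
    return None  # no confident winner yet: gather more samples and iterate
```

In an IDMS-style loop, one would re-run estimate_transitions after each new batch of samples and stop as soon as confidently_optimal returns a scheduler or a data or computation budget is exhausted, which is where the accuracy versus data-usage versus overhead tradeoff arises.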
Citation
Su, G., Chen, T., Feng, Y., Rosenblum, D. S., & Thiagarajan, P. S. (2016). An iterative decision-making scheme for Markov decision processes and its application to self-adaptive systems. In Lecture Notes in Computer Science (Vol. 9633, pp. 269–286). Springer. https://doi.org/10.1007/978-3-662-49665-7_16