Dynamic power management based on continuous-time Markov decision processes

Abstract

This paper introduces a continuous-time, controllable Markov process model of a power-managed system. The system model is composed of the stochastic models of the service queue and the service provider; the system environment is modeled by a stochastic service-request process. The problem of dynamic power management in such a system is formulated as a policy optimization problem and solved with an efficient policy iteration algorithm. Compared to previous work on dynamic power management, our formulation models the individual system components, the power-managed system as a whole, and its environment more accurately. In addition, it captures the dependencies between the service queue and the service provider status. Finally, the resulting power-management policy is asynchronous and is therefore more power-efficient and more practical. Experimental results demonstrate the effectiveness of our policy optimization algorithm compared to a number of heuristic (time-out and N-policy) algorithms.
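To illustrate the policy iteration step mentioned in the abstract, the sketch below runs standard policy iteration on a tiny, hypothetical discrete-time power-management MDP (two states, two actions). All transition probabilities, power costs, and the discount factor are invented for the example and are not taken from the paper; a continuous-time model such as the authors' CTMDP would first be uniformized into a discrete-time model of this form.

```python
import numpy as np

# Hypothetical 2-state power-management MDP:
# states: 0 = "busy", 1 = "idle"; actions: "on", "sleep".
# P[a][s, s'] = transition probability from s to s' under action a.
P = {
    "on":    np.array([[0.7, 0.3],
                       [0.4, 0.6]]),
    "sleep": np.array([[0.9, 0.1],
                       [0.1, 0.9]]),
}
# c[a][s] = expected power cost per step in state s under action a.
c = {"on": np.array([2.0, 1.5]), "sleep": np.array([2.5, 0.2])}
gamma = 0.95  # discount factor (hypothetical)

def policy_iteration(P, c, gamma, n_states=2):
    policy = ["on"] * n_states
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) V = c_pi exactly.
        P_pi = np.array([P[policy[s]][s] for s in range(n_states)])
        c_pi = np.array([c[policy[s]][s] for s in range(n_states)])
        V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, c_pi)
        # Policy improvement: pick the cost-minimizing action per state.
        new_policy = [
            min(P, key=lambda a: c[a][s] + gamma * P[a][s] @ V)
            for s in range(n_states)
        ]
        if new_policy == policy:   # stable policy => optimal
            return policy, V
        policy = new_policy

policy, V = policy_iteration(P, c, gamma)
print(policy, V)  # keeps the provider on when busy, sleeps when idle
```

With these example numbers the iteration converges to keeping the service provider on in the busy state and putting it to sleep in the idle state, which matches the intuition behind time-out heuristics while being derived from the model rather than tuned by hand.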

Citation (APA)

Qiu, Q., & Pedram, M. (1999). Dynamic power management based on continuous-time Markov decision processes. In Proceedings - Design Automation Conference (pp. 555–561). IEEE. https://doi.org/10.1145/309847.309997
