A Markov decision process approach to dynamic power management in a cluster system


Abstract

Dynamic power management (DPM) plays a significant role in reducing power consumption effectively in both the design and operational phases of computer-based systems. It is well known that a state-dependent control policy, which monitors the energy states of each component or of the whole system, is efficient for power saving in server systems whose system state, such as the transaction-request process, can be completely observed. In this paper, we consider an optimal power-aware design for a cluster system and formulate the DPM problem as a Markov decision process. We derive the dynamic programming equation for the optimal control policy, which maximizes the expected reward per unit of electrical power, called the power effectiveness, and give a policy iteration algorithm that determines the optimal control policy sequentially. In numerical experiments, we present the optimal control policy for an example cluster system with two service nodes, where the arrival stream of transaction requests is described by a Markov modulated Poisson process. In addition, based on access data from an enterprise system, we examine the optimal power-aware control for the cluster system and its effectiveness.
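The policy iteration scheme mentioned in the abstract can be illustrated with a minimal sketch for a generic finite MDP. This is not the paper's model: the authors optimize the power effectiveness (reward per unit of electrical power) under MMPP arrivals, whereas the sketch below uses a standard discounted-reward criterion and arbitrary illustrative transition matrices, purely to show the evaluate-then-improve loop.

```python
import numpy as np

def policy_iteration(P, R, gamma=0.95):
    """Minimal policy iteration for a finite MDP (illustrative only).

    P[a] is an (n_states x n_states) transition matrix under action a,
    R[a] is the expected one-step reward vector under action a.
    Returns the optimal stationary policy and its value function.
    """
    n_actions = len(P)
    n_states = P[0].shape[0]
    policy = np.zeros(n_states, dtype=int)  # start from an arbitrary policy
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
        P_pi = np.array([P[policy[s]][s] for s in range(n_states)])
        r_pi = np.array([R[policy[s]][s] for s in range(n_states)])
        v = np.linalg.solve(np.eye(n_states) - gamma * P_pi, r_pi)
        # Policy improvement: greedy one-step lookahead over all actions.
        q = np.array([R[a] + gamma * P[a] @ v for a in range(n_actions)])
        new_policy = q.argmax(axis=0)
        if np.array_equal(new_policy, policy):  # converged: policy is optimal
            return policy, v
        policy = new_policy
```

In the paper's setting the state would encode the service nodes' power modes together with the MMPP phase, and the improvement step would rank actions by reward earned per unit of power rather than by discounted reward.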

Citation (APA)

Okamura, H., Miyata, S., & Dohi, T. (2015). A Markov decision process approach to dynamic power management in a cluster system. IEEE Access, 3, 3039–3047. https://doi.org/10.1109/ACCESS.2015.2508601
