Dynamic power management (DPM) plays a significant role in reducing power consumption during both the design and operational phases of computer-based systems. It is well known that a state-dependent control policy, which monitors the energy state of each component or of the whole system, is effective for power saving in server systems whose system state, such as the stream of transaction requests, is completely observable. In this paper, we consider optimal power-aware design of a cluster system and formulate the DPM problem as a Markov decision process. We derive the dynamic programming equation for the optimal control policy, which maximizes the expected reward per unit of electrical power, called the power effectiveness, and present a policy iteration algorithm that determines the optimal control policy sequentially. In numerical experiments, we show the optimal control policy for an example cluster system with two service nodes, where the arrival stream of transaction requests is modeled as a Markov-modulated Poisson process. In addition, using access data from an enterprise system, we examine the optimal power-aware control of the cluster system and its effectiveness.
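The abstract's core technique is policy iteration on a Markov decision process whose states and actions encode power modes. The paper optimizes the power effectiveness, a ratio criterion (expected reward per unit of electrical power); as a minimal sketch, the snippet below instead runs plain discounted policy iteration on a hypothetical two-state power MDP with a reward-minus-power-cost objective. All states, actions, transition probabilities, and rewards are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy power-management MDP (hypothetical numbers, not from the paper).
# States: 0 = node asleep, 1 = node active.
# Actions: 0 = keep the current mode, 1 = switch mode.
gamma = 0.95  # discount factor

# P[s, a, s'] — transition probabilities (illustrative values).
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],  # from sleep: "keep" mostly stays asleep, "switch" mostly wakes
    [[0.1, 0.9], [0.7, 0.3]],  # from active: "keep" mostly stays active, "switch" mostly sleeps
])
# R[s, a] — service reward net of power cost (illustrative values).
R = np.array([
    [0.0, -0.5],  # sleeping earns nothing; waking pays a switching cost
    [1.0,  0.2],  # staying active earns net reward; going to sleep saves power
])

def policy_iteration(P, R, gamma):
    """Classic policy iteration: exact evaluation, then greedy improvement."""
    S, A = R.shape
    policy = np.zeros(S, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
        P_pi = P[np.arange(S), policy]          # S x S transition matrix under policy
        r_pi = R[np.arange(S), policy]          # per-state reward under policy
        v = np.linalg.solve(np.eye(S) - gamma * P_pi, r_pi)
        # Policy improvement: one-step lookahead action values.
        q = R + gamma * (P @ v)                 # S x A
        new_policy = q.argmax(axis=1)
        if np.array_equal(new_policy, policy):  # stable policy => optimal
            return policy, v
        policy = new_policy

policy, v = policy_iteration(P, R, gamma)
print(policy, v)
```

With these illustrative numbers the iteration converges to "wake up when asleep, stay on when active"; the paper's ratio criterion additionally weights rewards by the power actually consumed, which changes the evaluation step but not the overall iterate-evaluate-improve structure.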
Okamura, H., Miyata, S., & Dohi, T. (2015). A Markov decision process approach to dynamic power management in a cluster system. IEEE Access, 3, 3039–3047. https://doi.org/10.1109/ACCESS.2015.2508601