Markov chains and Markov decision processes (MDPs) are special cases of stochastic games. Markov chains describe the dynamics of the states of a stochastic game where each player has a single action in each state. Similarly, the dynamics of the states of a...
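The reduction described above can be made concrete: when every player's action set in every state is a singleton, the joint action is fixed, so the next-state distribution depends only on the current state. The sketch below illustrates this with a hypothetical two-state example; the transition probabilities are illustrative and not taken from the chapter.

```python
import random

# Hypothetical stochastic game with states "A" and "B" in which each
# player has exactly one action per state. The joint action is then
# determined by the state, so the state dynamics collapse to a Markov
# chain with the transition matrix below (rows sum to 1).
transitions = {
    "A": {"A": 0.7, "B": 0.3},  # P(next state | current state "A")
    "B": {"A": 0.4, "B": 0.6},  # P(next state | current state "B")
}

def simulate(start, n, seed=0):
    """Sample a length-n trajectory of the induced Markov chain.

    Because action sets are singletons, no player choices appear here:
    the next state is drawn from transitions[current] alone, which is
    exactly the Markov property.
    """
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        row = transitions[path[-1]]
        nxt = rng.choices(list(row), weights=list(row.values()), k=1)[0]
        path.append(nxt)
    return path

path = simulate("A", 10)
```

With nontrivial action sets, `transitions` would instead be indexed by (state, joint action), recovering the general stochastic-game dynamics; an MDP is the intermediate case where only one player has a real choice.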
Neyman, A. (2003). From Markov Chains to Stochastic Games. In Stochastic Games and Applications (pp. 9–25). Springer Netherlands. https://doi.org/10.1007/978-94-010-0189-2_2