From Markov Chains to Stochastic Games

  • Neyman, A.

Abstract

Markov chains and Markov decision processes (MDPs) are special cases of stochastic games. Markov chains describe the dynamics of the states of a stochastic game where each player has a single action in each state. Similarly, the dynamics of the states of a...
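
To make the reduction concrete, here is a minimal sketch in standard stochastic-game notation; the symbols S, A_i, and q are generic conventions assumed for illustration, not taken from the chapter. A stochastic game specifies a state space S, an action set A_i(s) for each player i in each state s, and a transition kernel q(· | s, a) giving the distribution of the next state given the current state s and joint action a. When every A_i(s) is a singleton, the joint action a(s) is determined by the state alone, so the state dynamics collapse to the Markov chain

\[
  P(s' \mid s) = q\bigl(s' \mid s, a(s)\bigr).
\]

Likewise, fixing stationary strategies for all players but one leaves a single decision maker, and the induced state dynamics form an MDP.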

Citation (APA)

Neyman, A. (2003). From Markov Chains to Stochastic Games. In Stochastic Games and Applications (pp. 9–25). Springer Netherlands. https://doi.org/10.1007/978-94-010-0189-2_2
