Markov Chains: Basic Definitions

Abstract

Heuristically, a discrete-time stochastic process has the Markov property if the past and the future are independent given the present. In this introductory chapter, we give the formal definition of a Markov chain and of the main objects related to this type of stochastic process, and we establish basic results. In particular, in Section 1.2 we introduce the essential notion of a Markov kernel, which gives the distribution of the next state given the current state. In Section 1.3, we restrict attention to time-homogeneous Markov chains and establish a fundamental consequence of the Markov property: the entire distribution of a Markov chain is characterized by the distribution of its initial state together with a Markov kernel. In Section 1.4, we introduce invariant measures, which play a key role in the study of the long-term behavior of a Markov chain. Finally, in Sections 1.5 and 1.6, which can be skipped on a first reading, we introduce the notion of reversibility, which is very convenient and is satisfied by many Markov chains, and some further properties of Markov kernels viewed as operators acting on certain spaces of functions.
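The characterization mentioned above can be illustrated concretely in the finite-state case, where a Markov kernel reduces to a row-stochastic matrix. The following is a minimal sketch, not taken from the chapter: the two-state kernel `P`, the initial distribution `mu`, and all function names are illustrative choices for exposition.

```python
import numpy as np

# Illustrative two-state example: on a finite state space, a Markov kernel
# is a row-stochastic matrix P, with P[x, y] the probability of moving
# from state x to state y.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Distribution of the initial state X_0.
mu = np.array([1.0, 0.0])

def distribution_at(n, mu, P):
    """Distribution of X_n. The law of a time-homogeneous chain is
    determined by mu and P alone: mu_n = mu P^n."""
    return mu @ np.linalg.matrix_power(P, n)

def simulate(n_steps, mu, P, rng):
    """Sample one trajectory X_0, ..., X_{n_steps} using only mu and P."""
    x = rng.choice(len(mu), p=mu)
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(P.shape[1], p=P[x])
        path.append(x)
    return path

# An invariant measure pi satisfies pi P = pi; for this P the balance
# equations give pi = (5/6, 1/6).
pi = np.array([5 / 6, 1 / 6])
assert np.allclose(pi @ P, pi)

# This chain is also reversible: pi satisfies the detailed balance
# condition pi[x] P[x, y] = pi[y] P[y, x] for all x, y.
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)
```

For this kernel the second eigenvalue of `P` is 0.4, so `distribution_at(n, mu, P)` converges geometrically to `pi`, a first glimpse of the long-term behavior studied via invariant measures.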

Citation (APA)

Douc, R., Moulines, E., Priouret, P., & Soulier, P. (2018). Markov Chains: Basic Definitions. In Springer Series in Operations Research and Financial Engineering (pp. 3–25). Springer Nature. https://doi.org/10.1007/978-3-319-97704-1_1
