Stationary Measures For Markov Chains


I would like to know why, in the theory of Markov chains, we are always interested in knowing the stationary measure(s) of a Markov chain, if they exist.

Could you list some results that clarify the meaning of this concept for me?


Stationary measures describe the long-run behavior of Markov chains. In probability theory, one is usually interested in answering questions of the kind, "what happens as $n \rightarrow \infty$?"

Given an aperiodic and irreducible Markov chain $X$ with state space $S$ that admits a stationary distribution $\mu$, for any sufficiently large $t$ we have $\mathbb{P}(X_t = x) \approx \mu(x)$ for every $x \in S$. That is, regardless of where the chain starts, if we run it long enough its long-run behavior is governed by its stationary distribution. To be more precise, if a stationary distribution $\mu$ exists, then $$ \forall \, a, b \in S, \quad \lim_{n \rightarrow \infty} \pi^n(a, b) = \mu(b), $$ where $\pi^n$ is the $n$-step transition kernel of the Markov chain.
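As a quick numerical sketch of this limit (using an illustrative 2-state chain of my own choosing, not one from the question): raising the transition matrix $P$ to a large power makes every row approach the stationary distribution $\mu$, which can also be found as the left eigenvector of $P$ for eigenvalue $1$.

```python
import numpy as np

# An illustrative 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The n-step transition kernel is pi^n = P^n. For large n, every row of
# P^n converges to the stationary distribution mu, regardless of the
# starting state (row index).
Pn = np.linalg.matrix_power(P, 50)
print(Pn)  # both rows are approximately [5/6, 1/6]

# Independently, mu solves mu P = mu with sum(mu) = 1, i.e. mu is the
# left eigenvector of P associated with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
mu = mu / mu.sum()
print(mu)  # approximately [5/6, 1/6]
```

For this $P$ one can check by hand that $\mu = (5/6, 1/6)$ satisfies $\mu P = \mu$, and both rows of $P^{50}$ agree with it to high precision, matching the limit statement above.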