Irreducible Markov chain with µ as the stationary distribution.


Prove that if $\mu$ is a stationary distribution for an irreducible Markov chain on $S$, then $\mu(j) > 0 \ \forall j \in S$. (only using the fact that Markov chain is irreducible and that $\mu P = \mu$)

How can I do that? I have used this fact while proving other limit theorems but can't think about how to do it.

Accepted answer:

An irreducible, time-homogeneous Markov chain on $n$ states has a single communicating class. The result follows by breaking the argument into two pieces.

(1.) Prove that, with a single communicating class, each state $j$ is reached with positive probability in at most $n$ transitions from any state $i$. The standard approach is graph-theoretic, but the result is also implied by the Cayley-Hamilton theorem.
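Step (1.) can be sanity-checked numerically: for an irreducible chain on $n$ states, the matrix $\sum_{k=1}^{n} P^k$ should have no zero entries. A minimal sketch (the 4-state matrix $P$ below is a hypothetical example, not from the question):

```python
import numpy as np

# Hypothetical 4-state irreducible transition matrix: a cycle
# 0 -> 1 -> 2 -> 3 with some extra mixing out of state 3, so
# every state communicates with every other state.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [0.5, 0.0, 0.5, 0.0],
])
n = P.shape[0]

# Claim (1.): from any state i, every state j is reached with positive
# probability within at most n transitions; equivalently,
# sum_{k=1}^{n} P^k has strictly positive entries everywhere.
reach = sum(np.linalg.matrix_power(P, k) for k in range(1, n + 1))
print(np.all(reach > 0))
```

The $(i,j)$ entry of $P^k$ is the probability of being in $j$ after $k$ steps from $i$, so a positive entry somewhere in the sum is exactly "reachable within $n$ transitions".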

(2.) Suppose for a contradiction that $\mathbf \mu P = \mathbf \mu$ but $\mu_j = 0$ for some $j$. Start the chain in steady state, i.e. with initial distribution $\mathbf \mu$ (a convex combination of the $n$ point masses). By (1.), the chain visits state $j$ within $n$ transitions with positive probability. Writing $A_k$ for the event that the chain visits state $j$ on the $k$th transition, given a start in steady state, this yields:

$$
\begin{aligned}
0 &\lt \Pr\{\text{visit state } j \text{ within } n \text{ transitions} \mid \text{start in steady state}\} \\
&= \Pr\big(A_1 \cup A_2 \cup \cdots \cup A_n\big) \\
&\leq \sum_{k=1}^{n}\Pr\big(A_k\big) \qquad \text{(union bound)} \\
&= \sum_{k=1}^{n} \mathbf \mu P^k \mathbf e_j \\
&= \sum_{k=1}^{n} \mathbf \mu \,\mathbf e_j \qquad (\mathbf \mu P = \mathbf \mu \text{ implies } \mathbf \mu P^k = \mathbf \mu) \\
&= \sum_{k=1}^{n} 0 = 0,
\end{aligned}
$$

a contradiction.
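The conclusion of the contradiction argument, that every entry of $\mathbf \mu$ is strictly positive, can also be checked numerically by solving $\mathbf \mu P = \mathbf \mu$ directly. A sketch, reusing a hypothetical 4-state irreducible chain:

```python
import numpy as np

# Hypothetical 4-state irreducible transition matrix.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [0.5, 0.0, 0.5, 0.0],
])

# Solve mu P = mu: mu is the left eigenvector of P (equivalently the
# right eigenvector of P^T) for eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
mu = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
mu = mu / mu.sum()

# Stationarity holds, and no entry is zero -- as the proof requires.
print(np.allclose(mu @ P, mu), np.all(mu > 0))
```

For this example the stationary distribution works out to $\mathbf \mu = (1/6, 1/6, 1/3, 1/3)$, all entries positive.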

Alternatively, we could re-frame the above as a delayed renewal process, where $N(t)$ counts the number of visits to state $j$, given a start in steady state (the first inter-visit time may have a different distribution, hence *delayed*). Then, with $t := n$, the above reads

$0 \lt \Pr\{N(t) \geq 1\} \leq \frac{\mathbb E[N(t)]}{1} = \mathbb E[N(t)] = \sum_{k=1}^n \mathbb E[\mathbf 1_{A_k}] = \sum_{k=1}^{n} \mathbf \mu P^k \mathbf e_j = 0,$

where the second step is Markov's inequality — again a contradiction.
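The renewal identity underlying this, $\mathbb E[N(n)] = \sum_{k=1}^n \mathbf \mu P^k \mathbf e_j = n\,\mu_j$ when started in steady state, can be sanity-checked by simulation. A sketch (the chain and its stationary distribution below are a hypothetical example; `j` is the state whose visits we count):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-state irreducible chain and its stationary distribution
# (mu solves mu P = mu for this particular P).
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [0.5, 0.0, 0.5, 0.0],
])
mu = np.array([1/6, 1/6, 1/3, 1/3])
n = P.shape[0]
j = 0  # state whose visit count N(t) we track

# Start each run in steady state (s ~ mu), take n transitions, and
# count visits to j.  In steady state mu P^k = mu for every k, so
# E[N(n)] = sum_{k=1}^n mu P^k e_j = n * mu_j.
trials = 50_000
visits = 0
for _ in range(trials):
    s = rng.choice(n, p=mu)
    for _ in range(n):
        s = rng.choice(n, p=P[s])
        visits += (s == j)

# Empirical mean visit count should be close to n * mu_j.
print(abs(visits / trials - n * mu[j]) < 0.02)
```

The point of the simulation is only the expectation identity; in the actual proof $\mu_j = 0$ would force $\mathbb E[N(n)] = 0$, contradicting $\Pr\{N(n) \geq 1\} > 0$.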