How do I know if a Markov chain converges?


Suppose that $X_n$ is a Markov chain on the states $\{0,1,2,3,4\}$ with transition matrix: $$\begin{bmatrix} 0 & 0 & 0 & 0 & 1 \\ 2/3 & 0 & 0 & 1/3 & 0 \\ 1/5 & 1/5 & 1/5 & 1/5 & 1/5 \\ 0 & 0 & 1/2 & 0 & 1/2 \\ 1 & 0 & 0 & 0 & 0 \end{bmatrix}$$

I have been asked to determine whether $X_n$ converges almost surely or in $L^1$. How do I tell whether either statement is true?

When I think of convergence of a Markov chain, I think of the stationary distribution (if the chain is ergodic), but I'm not sure if this is right. Could someone please provide some hints?
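As a numerical sanity check on the stationary-distribution intuition (this is an illustrative sketch using NumPy, not part of the original question): a stationary distribution $\pi$ is a left eigenvector of $P$ for eigenvalue $1$. Here the mass concentrates on states $0$ and $4$, and the spectrum also contains eigenvalue $-1$, which signals period-$2$ behaviour on that class, so the distribution of $X_n$ cannot converge either.

```python
import numpy as np

# Transition matrix from the question.
P = np.array([
    [0,   0,   0,   0,   1  ],
    [2/3, 0,   0,   1/3, 0  ],
    [1/5, 1/5, 1/5, 1/5, 1/5],
    [0,   0,   1/2, 0,   1/2],
    [1,   0,   0,   0,   0  ],
])

# A stationary distribution pi satisfies pi P = pi, i.e. it is a left
# eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(vals - 1))
pi = np.real(vecs[:, idx])
pi = pi / pi.sum()

print(np.round(pi, 6))                 # mass only on states 0 and 4
print(np.any(np.isclose(vals, -1)))    # eigenvalue -1 => period 2
```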


1 Answer


Let $\tau=\inf\{n>0: X_n\in\{0,4\}\}$. Because $P_{04}=P_{40}=1$ and the states $\{1,2,3\}$ are transient, it is clear that $$\mathbb P\left(\bigcap_{n=\tau}^\infty \{X_n\in\{0,4\}\}\right) =1. $$ Moreover, if $N$ is a positive integer such that $X_N=0$, then $X_{N+2k}=0$ for $k=1,2,3,\ldots$ with probability $1$, and similarly if $M$ is a positive integer such that $X_M=4$, then $X_{M+2k}=4$ for $k=1,2,3,\ldots$ with probability $1$. It follows that $$ \mathbb P\left(\limsup_{n\to\infty} \left\{X_n = 0\right\} \right)=1 $$ while $$ \mathbb P\left(\limsup_{n\to\infty} \left\{X_n = 4\right\} \right)=1, $$ so $X_n$ does not converge almost surely. Convergence in $L^1$ fails as well: for $n\ge\tau$ the chain alternates between $0$ and $4$, so $\mathbb E|X_{n+1}-X_n|\to 4\neq 0$ and $(X_n)$ is not even Cauchy in $L^1$.
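The alternation described above is easy to see in a simulation: once the chain enters $\{0,4\}$ it flips deterministically between the two states, so the sample path oscillates forever rather than settling. A minimal sketch (Python/NumPy; the seed and path length are arbitrary choices):

```python
import numpy as np

# Transition matrix from the question.
P = np.array([
    [0,   0,   0,   0,   1  ],
    [2/3, 0,   0,   1/3, 0  ],
    [1/5, 1/5, 1/5, 1/5, 1/5],
    [0,   0,   1/2, 0,   1/2],
    [1,   0,   0,   0,   0  ],
])

rng = np.random.default_rng(0)
x = 2                      # start in a transient state
path = [x]
for _ in range(1000):
    x = rng.choice(5, p=P[x])
    path.append(x)

tail = path[-100:]
# After absorption into {0, 4} the chain alternates deterministically,
# so the path has no limit: ..., 0, 4, 0, 4, ...
print(tail[:6])
```

Since every transient state reaches $\{0,4\}$ with positive probability in at most two steps, absorption within 1000 steps is essentially certain, and the tail of the path consists of strictly alternating $0$s and $4$s.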