Prove that the Markov chain is irreducible and recurrent


Show that if a Markov chain $(X_{n})_{n}$ with transition matrix $P$ admits a $P$-invariant probability $\lambda$ (i.e. $\lambda=\lambda \cdot P$), then the Markov chain is irreducible and recurrent.

I am trying to show that $U(x,x)=+\infty$ for all $x \in E$ with $\lambda(x)>0$, where $U(x,x)=\sum_{n \geq 1}P^{n}(x,x)=\sum_{n \geq 1}\mathbb{P}(X_{n}=x \mid X_{0}=x)$. For this, I think I need to use the bound $U(y,x) \leq 1+U(x,x)$ for all $y \in E$, but I don't know how to proceed.
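One way such a bound might be used (a sketch; here $F(y,x)=\mathbb{P}(\exists\, n\geq 1:\ X_{n}=x \mid X_{0}=y)$ denotes the first-passage probability, which is not defined above): for $y \neq x$, the first-passage decomposition gives $U(y,x)=F(y,x)\,\bigl(1+U(x,x)\bigr) \leq 1+U(x,x)$, and $U(x,x) \leq 1+U(x,x)$ is trivial. Then for every $N \geq 1$, invariance of $\lambda$ gives

$$N\,\lambda(x)=\sum_{n=1}^{N}(\lambda P^{n})(x)=\sum_{y \in E}\lambda(y)\sum_{n=1}^{N}P^{n}(y,x) \leq \sum_{y \in E}\lambda(y)\,\bigl(1+U(x,x)\bigr)=1+U(x,x).$$

If $\lambda(x)>0$, letting $N \to \infty$ forces $U(x,x)=+\infty$, i.e. $x$ is recurrent. Note this argument yields recurrence of states with positive mass, but says nothing about irreducibility.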

I wonder if maybe the statement is only true if the state space $E$ is finite...

1 Answer


Not true, even when the state space is finite. Let $\{X_n\}$ have state space $\{0,1\}$ and suppose both $0$ and $1$ are absorbing states, so the transition matrix is the identity matrix. Then $(\frac 1 2,\frac 1 2)$ is an invariant distribution, and every state is (trivially) recurrent, but the chain is not irreducible.
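The counterexample can also be checked numerically; a minimal sketch in plain Python (the setup matches the answer: two absorbing states, $P$ the identity):

```python
# Counterexample check: state space {0, 1}, both states absorbing,
# so the transition matrix P is the 2x2 identity.
P = [[1.0, 0.0],
     [0.0, 1.0]]
lam = [0.5, 0.5]

def left_mult(v, M):
    """Row vector times matrix: (vM)_j = sum_i v_i * M[i][j]."""
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

# Invariance: lam * P == lam, so lam is a P-invariant probability.
print(left_mult(lam, P) == lam)   # True

# Not irreducible: P^n(0, 1) stays 0 for every n, so state 1 is
# never reached from state 0.
Pn = P
for _ in range(10):
    Pn = [left_mult(row, P) for row in Pn]
print(Pn[0][1])                   # 0.0
```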