Mean number of steps needed to reach a recurrent state in a finite irreducible Markov chain


Let $\mathbb{X}$ be the finite state space of an irreducible Markov chain $\{X_n\}$, and let $T_x=\inf\{k\geq 0\mid X_k=x\}$ be the number of steps until $\{X_n\}$ reaches state $x\in \mathbb{X}$. True or false? $\mathbb{E}[T_x\mid X_0=x_0]<+\infty$ for all $x_0\in \mathbb{X}.$

Attempt. The chain is irreducible on a finite state space and therefore recurrent, i.e. $P(T_x<+\infty\mid X_0=x_0)=1$ for all $x_0\in \mathbb{X}$. I believe the statement is true, and the finiteness of the state space plays a crucial role in this.

Indeed, finiteness cannot be dropped: the symmetric random walk on the integers (an infinite state space) is recurrent, yet the expected time to reach any state other than the starting one is $+\infty.$



Best answer.

Say there are $N$ states. By irreducibility, from any starting state there is a path of at most $N-1$ steps, with positive probability, that reaches $x$. Since there are only finitely many starting states, there exists $p<1$ such that for every $x_0$, $$P(T_x\ge N\mid X_0=x_0)\le p.$$Hence, by the Markov property, $$P(T_x\ge 2N\mid X_0=x_0)\le p^2,$$etc. (In order to miss state $x$ in the first $2N$ steps you have to miss $x$ in the first $N$ steps, and then, from wherever you are, miss $x$ again in the next $N$ steps.)
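The geometric tail bound can be checked numerically. Here is a minimal sketch in Python; the 3-state transition matrix $P$ below is a hypothetical example, not from the question. Deleting the row and column of the target state $x$ from $P$ leaves a substochastic matrix $Q$, and for $x_0\ne x$ one has $P(T_x>m\mid X_0=x_0)=(Q^m\mathbf 1)_{x_0}$.

```python
import numpy as np

# Hypothetical 3-state irreducible chain (an assumption for illustration).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
x = 0          # target state
N = P.shape[0]

# Q = P restricted to the non-target states; then for x0 != x,
# P(T_x > m | X_0 = x0) = (Q^m 1)_{x0}.
keep = [i for i in range(N) if i != x]
Q = P[np.ix_(keep, keep)]

# Tail probabilities P(T_x > kN | X_0 = x0) for k = 0, 1, 2, 3, 4.
tails = [np.linalg.matrix_power(Q, k * N) @ np.ones(N - 1)
         for k in range(5)]

# A valid choice of p from the argument: the worst-case probability of
# missing x during one block of N steps.
p = tails[1].max()
assert p < 1

# Each further block of N steps multiplies the tail by at most p.
for k in range(5):
    assert np.all(tails[k] <= p ** k + 1e-12)
```

The assertion in the loop is exactly the answer's bound $P(T_x> kN\mid X_0=x_0)\le p^k$, verified by matrix powers for this example chain.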

So, splitting according to which block of $N$ steps contains $T_x$ (on the event $\{kN\le T_x<(k+1)N\}$ we have $T_x<(k+1)N$), $$\Bbb E[T_x\mid X_0=x_0]\le \sum_{k\ge 0}(k+1)N\,P(T_x\ge kN\mid X_0=x_0)\le N\sum_{k\ge 0}(k+1)p^k<\infty.$$
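Once finiteness is known, the expected hitting times can be computed exactly by first-step analysis. A short sketch, again using a hypothetical 3-state chain (the matrix `P` is an assumption): with $Q$ the transition matrix restricted to the non-target states, the vector $h(x_0)=\Bbb E[T_x\mid X_0=x_0]$ (for $x_0\ne x$) satisfies $h=\mathbf 1+Qh$, i.e. $h=(I-Q)^{-1}\mathbf 1$, and $I-Q$ is invertible precisely because the tail probabilities decay geometrically.

```python
import numpy as np

# Hypothetical 3-state irreducible chain (an assumption for illustration).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
x = 0          # target state
keep = [i for i in range(P.shape[0]) if i != x]
Q = P[np.ix_(keep, keep)]

# First-step analysis: h = 1 + Q h, so h = (I - Q)^{-1} 1.
h = np.linalg.solve(np.eye(len(keep)) - Q, np.ones(len(keep)))
print(h)   # finite expected hitting times of x from the other states
```

For this particular matrix the system solves by back-substitution: starting from state 2, $h_2=1/0.4=2.5$, and then $0.7\,h_1=1+0.5\cdot 2.5$ gives $h_1=2.25/0.7\approx 3.214$; both finite, as the answer's argument guarantees.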