understanding proof of $P_i(T_j<\infty) =1$ for recurrent states $i,j$ in irreducible Markov chain by Norris


I am currently working on Markov chains and stumbled across a proof in Norris.

We have an irreducible Markov chain with recurrent states in state space $I$. The statement is now that

$P_i(T_j < \infty) =1$ for all $i,j \in I$.

$P_i$ denotes the probability conditioned on $X_0=i$.

So let $i,j \in I$.

First we choose an $m$ with $p_{ji}^{(m)}>0$, which exists since we are irreducible.

Then the proof goes on with

$$ 1 \overset{(1)}{=} P_j(X_n=j \text{ for infinitely many } n) \overset{(2)}{\leq} P_j(X_n=j \text{ for some } n \geq m+1) \\ \overset{(3)}{=} \sum\limits_{k \in I} P_j(X_n = j \text{ for some } n \geq m+1 \mid X_m = k)\, P_j(X_m = k) \\ \overset{(4)}{=} \sum\limits_{k \in I} P_k(T_j<\infty)\, p_{jk}^{(m)} $$

Equality (1) is the definition of recurrence, and inequality (2) follows by monotonicity of probability, since the event on the RHS of (2) is less restrictive than the one on the LHS. Equality (3) is the law of total probability, conditioning on the value of $X_m$. Finally, equality (4) follows from $P_j(X_m = k) = p_{jk}^{(m)}$ together with the Markov property: conditional on $X_m = k$, the chain $(X_{m+n})_{n \geq 0}$ is a Markov chain started at $k$, so $P_j(X_n = j \text{ for some } n \geq m+1 \mid X_m = k) = P_k(X_n = j \text{ for some } n \geq 1) = P_k(T_j<\infty)$.

Are these observations right so far?

Now we observe that $\sum\limits_{k \in I} p_{jk}^{(m)} = 1$, which is true since the power of a stochastic matrix, here $P = (p_{ij})_{i,j\in I}$, is again a stochastic matrix and hence the rows sum up to $1$.
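As a quick sanity check of this row-sum fact, one can raise a small stochastic matrix to a power numerically. The $3\times 3$ matrix below is a made-up example, not from Norris; any stochastic matrix would do:

```python
import numpy as np

# Hypothetical 3-state transition matrix (an assumption for illustration);
# each row of P sums to 1.
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])

# P^m is again stochastic: every row of the m-th power sums to 1.
Pm = np.linalg.matrix_power(P, 5)
print(Pm.sum(axis=1))  # each entry is 1 (up to floating-point error)
```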

The proof finishes with "so we must have $P_i(T_j<\infty)=1$".

I don't get that last step. I assume it is some obvious observation, since otherwise there would be a more precise argument, but I just don't see it.

1 Answer

Note that if $P_k(T_j < \infty) < 1$ for some $k \in I$ with $p_{jk}^{(m)} > 0$, then $$ 1 \leq \sum_k P_k(T_j< \infty)\,p_{jk}^{(m)} < \sum_k 1\cdot p_{jk}^{(m)} = 1, $$

which is a contradiction. So $P_k(T_j<\infty) = 1$ for every $k$ with $p_{jk}^{(m)} > 0$. In particular, since $m$ was chosen so that $p_{ji}^{(m)} > 0$, you can conclude that $P_i(T_j<\infty) = 1$.
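One can also check the conclusion numerically on a finite irreducible chain. A standard fact (not specific to Norris's proof) is that the hitting probabilities $h_k = P_k(T_j < \infty)$ satisfy $h_k = \sum_l p_{kl} h_l$ for $k \neq j$ with $h_j = 1$; restricting to the states $k \neq j$ gives the linear system $(I - Q)h = b$, where $Q$ is $P$ with the row and column of $j$ deleted and $b$ holds the one-step probabilities into $j$. The matrix below is a hypothetical example for illustration:

```python
import numpy as np

# Hypothetical 3-state irreducible chain (an assumption for illustration).
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])

j = 2  # target state
others = [k for k in range(len(P)) if k != j]

# (I - Q) h = b, with Q = P restricted to the non-target states
# and b = one-step transition probabilities into state j.
Q = P[np.ix_(others, others)]
b = P[others, j]
h = np.linalg.solve(np.eye(len(others)) - Q, b)
print(h)  # both hitting probabilities equal 1, as the theorem asserts
```

For a finite irreducible chain $I - Q$ is invertible and the unique solution is $h_k = 1$ for all $k$, matching $P_i(T_j<\infty)=1$.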