Prove that the probability of a Markov chain started at state $i$ returning to $i$ at least $n$ times is $(\rho_{ii})^n$


Suppose the "ever-return probability to state i" is $\rho_{ii} = \sum_{k=1}^\infty \rho_{ii}^k$, where $\rho_{ii}^k$ stands for the probability to return to state $i$ starting from state $i$. Using the strong Markov property to argue that the probability of the Markov chain returning $n$ times given the initial state $i$ is $(\rho_{ii})^n$.

This intuitively makes sense to me: given a stopping time $T$ (like the time of the first return to state $i$), the future of the process is independent of the past, given the present. So after the chain returns to state $i$ for the first time, the probability that it returns again is still $\rho_{ii}$. Therefore, the probability of at least $n$ returns is the product of the $n$ individual return probabilities, which is $(\rho_{ii})^n$.
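To sanity-check this intuition numerically, here is a small Monte Carlo sketch on a hypothetical three-state chain (states 0, 1, 2, with 2 absorbing); the transition probabilities and the resulting $\rho_{00} = 0.4 + 0.3 \cdot 0.5 = 0.55$ are made up purely for illustration:

```python
import random

# Hypothetical chain for illustration: states 0, 1, 2; state 2 is absorbing.
P = {0: [(0, 0.4), (1, 0.3), (2, 0.3)],
     1: [(0, 0.5), (2, 0.5)],
     2: [(2, 1.0)]}

def step(s):
    """Sample the next state from the transition distribution of state s."""
    r, acc = random.random(), 0.0
    for t, p in P[s]:
        acc += p
        if r < acc:
            return t
    return P[s][-1][0]

def count_returns(start=0):
    """Run the chain until absorption in state 2; count visits back to `start`."""
    s, returns = start, 0
    while s != 2:
        s = step(s)
        if s == start:
            returns += 1
    return returns

random.seed(0)
trials = 200_000
counts = [count_returns() for _ in range(trials)]

# Exact ever-return probability to state 0 for this chain:
# return directly (0.4) or via state 1 (0.3 * 0.5).
rho = 0.4 + 0.3 * 0.5
for n in (1, 2, 3):
    est = sum(c >= n for c in counts) / trials
    print(f"n={n}: simulated P(at least n returns) = {est:.4f}, rho^n = {rho**n:.4f}")
```

The simulated frequency of "at least $n$ returns" should track $\rho_{00}^n$ closely, matching the claimed product structure.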

But I'm wondering if there's a way to do a formal proof? (Maybe a hint on how to start? Thanks!)
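One standard way to formalize the intuition is by induction on the successive return times, using the strong Markov property at each one; a sketch along those lines:

```latex
Let $T_0 = 0$ and, inductively,
$T_{n+1} = \min\{\, m > T_n : X_m = i \,\}$ (with $\min\emptyset = \infty$),
so that $\{T_n < \infty\}$ is the event of at least $n$ returns to $i$.
Each $T_n$ is a stopping time, and by the strong Markov property,
conditional on $\{T_n < \infty\}$, the shifted chain $(X_{T_n + m})_{m \ge 0}$
is again a Markov chain started at $i$. Hence
\[
  P_i(T_{n+1} < \infty \mid T_n < \infty) = P_i(T_1 < \infty) = \rho_{ii},
\]
so
\[
  P_i(T_{n+1} < \infty) = \rho_{ii}\, P_i(T_n < \infty).
\]
Induction with the base case $P_i(T_0 < \infty) = 1$ then gives
$P_i(T_n < \infty) = (\rho_{ii})^n$.
\]
```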