The number of hits of a Markov chain before hitting a set

Consider a Markov chain on a finite graph. Let $\mathrm{Z}$ be a set of vertices, let $a$ be a vertex not in $\mathrm{Z}$, and denote by $N$ the number of times the Markov chain, started at $a$, visits $a$ (counting the visit at time $0$) before entering $\mathrm{Z}.$ So, we set $X_0 = a$ and consider $N = \sum\limits_{n = 0}^\infty \mathbf{1}(X_n = a, \tau_\mathrm{Z} > n)$ (here $\tau_\mathrm{Z}$ is the first time the Markov chain is in $\mathrm{Z}$).

Claim: the expected value of $N$ equals $\dfrac{1}{p}$ where $p$ is the probability of hitting $\mathrm{Z}$ before returning to $a.$

The claim is made in a book I am reading; I tried to prove it formally, without success. I am also a bit puzzled as to why it is intuitively true. Any help is appreciated.

On BEST ANSWER

$N$ is just geometrically distributed. Starting at $a$, with probability $p$ you hit $\mathrm{Z}$ before returning to $a$, and with probability $1-p$ you return to $a$ first; by the Markov property, once you have returned to $a$ the situation is exactly the same as when you started. So the number of returns $R$ satisfies $\mathbf{P}(R = k) = (1-p)^k p$, and counting also the visit at time $0$, $N = R + 1$ is geometric on $\{1, 2, \dots\}$ with success probability $p$. Hence $\mathbf{E}[N] = \sum\limits_{k \ge 1} k (1-p)^{k-1} p = \dfrac{1}{p}.$ (If one counts only the returns and not the initial visit at time $0$, the expectation is $\frac{1-p}{p} = \frac{1}{p} - 1$ instead.)
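
A quick Monte Carlo sanity check (a hypothetical example of my own, not from the book): simple random walk on the path graph $0$–$1$–$2$–$3$ with $a = 0$ and $\mathrm{Z} = \{3\}$. From $0$ the walk must step to $1$, and by gambler's ruin the probability of reaching $3$ before $0$ from $1$ is $1/3$, so $p = 1/3$ and the claim predicts $\mathbf{E}[N] = 3$.

```python
import random

def sample_visits(rng, start=0, absorbing=3):
    """One walk on the path graph 0-1-2-3; returns the number of visits
    to `start` (counting the visit at time 0) before hitting `absorbing`."""
    pos = start
    visits = 1  # the visit at time 0
    while True:
        # neighbors of the current vertex on the path graph
        nbrs = [v for v in (pos - 1, pos + 1) if 0 <= v <= absorbing]
        pos = rng.choice(nbrs)
        if pos == absorbing:
            return visits
        if pos == start:
            visits += 1

def estimate_mean(n_samples=100_000, seed=42):
    """Monte Carlo estimate of E[N]."""
    rng = random.Random(seed)
    return sum(sample_visits(rng) for _ in range(n_samples)) / n_samples

print(estimate_mean())  # should be close to 1/p = 3
```

With $10^5$ samples the estimate lands well within a few hundredths of $3$, matching $1/p$.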