This is a pretty basic question and I know the answer is probably really obvious, but I am having trouble reasoning as to why the following is true:
(From my lecture notes):
""" Expected time between consecutive visits: Let $h_{ii}$ be the expected time between successive visits by an ergodic Markov chain to state $i$. Then,
$h_{ii} = 1/\pi_i$.
Since the Markov chain spends a $\pi_i$ fraction of its time in state $i$, the expected time between successive visits to that state must be $1/\pi_i$.
"""
So, for example, if $\pi_2 = 1/3$, then if the system has been running for a long time and an outside observer looks in, the probability of finding the particle in state 2 is 1/3. Then the expected time between successive visits is 3. That last part is what doesn't make sense to me. Why is the expected time between visits to state $i$ equal to 1/(probability of finding the particle in state $i$)? How can this be true for any $n$-state graph?
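Here is a quick simulation I wrote to check the claim numerically (my own sketch, not from the notes): a 3-state chain with uniform transitions, so $\pi_2 = 1/3$ and the mean return time to state 2 should come out near 3.

```python
import random

random.seed(0)

# 3-state chain where every transition probability is 1/3,
# so the stationary distribution is uniform: pi = (1/3, 1/3, 1/3).
def step(state):
    return random.randrange(3)

state = 2
return_times = []
elapsed = 0
for _ in range(200_000):
    state = step(state)
    elapsed += 1
    if state == 2:          # record the time since the last visit to state 2
        return_times.append(elapsed)
        elapsed = 0

mean_return = sum(return_times) / len(return_times)
print(mean_return)  # close to 1/pi_2 = 3
```

Running it, the average return time hovers right around 3, matching $1/\pi_2$.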
Any intuition would help me a lot!
Here’s one way to understand this: think of “outside observer finds the system in state $i$” as a Bernoulli trial with success probability $\pi_i$. The number of trials needed to produce the first success is then geometrically distributed, and it’s easy to show its expectation is $1/\pi_i$.
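You can also check the Bernoulli-trials fact directly (a sketch of my own, using $p = 1/3$ to match your example): the average number of trials until the first success should be close to $1/p = 3$.

```python
import random

random.seed(0)
p = 1 / 3  # probability the observer "finds" the system in state i

def trials_until_success(p):
    """Count independent Bernoulli trials until the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

samples = [trials_until_success(p) for _ in range(200_000)]
mean_trials = sum(samples) / len(samples)
print(mean_trials)  # close to 1/p = 3
```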