A formula for an expected value


We have a Markov chain with $X_0 = z$; let $\tau_z$ denote the first time $t \geq 1$ at which the chain returns to $z$, and let $y$ be some other state. A proof I'm reading states:

$$\operatorname{E}(\text{number of visits to $y$ before returning to $z$}) = \sum^\infty_{t=0}P(X_t = y,\ \tau_z > t)$$

I have no idea where this comes from. The only thing I can think of is some complicated approach using inclusion-exclusion to reduce the right-hand side to the tail-sum formula $E(X)=\sum_{n \geq 1} P(X \geq n)$, but I suspect I'm just missing something obvious, since the book drops the above formula without saying a word.


On BEST ANSWER

It's just linearity of expectation: the expected number of visits to $y$ before returning to $z$ is the sum, over all times $t$, of the expected number of such visits at time $t$. The number of visits at time $t$ is an indicator variable, so its expectation is the probability that the chain is at $y$ at time $t$ and has not yet returned to $z$, namely $P(X_t = y,\ \tau_z > t)$.
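To spell this out (this derivation is not in the original answer, but it is the standard way to make the argument precise): write the visit count as a sum of indicators and swap expectation and sum, which is justified by Tonelli's theorem since every term is nonnegative.

$$
N_y \;=\; \sum_{t=0}^{\tau_z - 1} \mathbf{1}\{X_t = y\} \;=\; \sum_{t=0}^{\infty} \mathbf{1}\{X_t = y,\ \tau_z > t\},
\qquad
\operatorname{E}[N_y] \;=\; \sum_{t=0}^{\infty} P(X_t = y,\ \tau_z > t).
$$

Note that no inclusion-exclusion is needed: the events $\{X_t = y,\ \tau_z > t\}$ for different $t$ are not disjointness-corrected, because $N_y$ counts *every* visit, not just the first one.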
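As a sanity check, here is a small numerical experiment (my own illustration, with an arbitrary 3-state chain; the states $z=0$, $y=1$ and the matrix `P` are made up for the example). The left side is estimated by Monte Carlo; the right side is computed exactly, using the fact that for $t \geq 1$ the event $\{X_t = y,\ \tau_z > t\}$ means one step out of $z$ followed by $t-1$ steps that avoid $z$, so the sum is a geometric series in the sub-matrix of `P` restricted to the non-$z$ states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example 3-state chain; z = 0 is the start state, y = 1 the state we count.
P = np.array([[0.2, 0.5, 0.3],
              [0.3, 0.4, 0.3],
              [0.5, 0.2, 0.3]])
z, y = 0, 1

def visits_before_return(n_paths=100_000):
    """Monte Carlo estimate of E(number of visits to y before returning to z)."""
    total = 0
    for _ in range(n_paths):
        x = z
        while True:
            x = rng.choice(3, p=P[x])
            if x == z:          # first return to z: stop counting
                break
            if x == y:
                total += 1
    return total / n_paths

# Exact value of sum_t P(X_t = y, tau_z > t).  The t = 0 term is 0 (y != z);
# for t >= 1 the path takes one step out of z, then t-1 steps among non-z states.
keep = [s for s in range(3) if s != z]
Q = P[np.ix_(keep, keep)]                # transitions that avoid z
a = P[z, keep]                           # first step out of z
# sum_{t>=1} a Q^{t-1} = a (I - Q)^{-1}; read off the y-component.
exact = (a @ np.linalg.inv(np.eye(len(keep)) - Q))[keep.index(y)]

print(visits_before_return(), exact)     # the two numbers should agree closely
```

The two printed numbers agree up to Monte Carlo error, which is what the linearity argument predicts.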