Mean time spent in a Markov chain


I've been trying to get my head around the following question concerning continuous-time Markov chains. Suppose we have a continuous-time Markov chain $X$ on a discrete state space $S$, and let $A \subset S$. Let $T_A$ be the hitting time of $A$, define $h(x) = P_x(T_A < \infty)$, and let $\tau_y$ be the total time spent in $y$ before hitting $A$. I want to prove that $$E_x(\tau_y\, |\, T_A<\infty) = \frac{h(y)}{h(x)}E_x(\tau_y)$$
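Not a proof, but here is a quick Monte Carlo sanity check of the identity on a small hand-made chain (the states, rates, and the choice of $A$ below are my own arbitrary example, not from the question). For this chain, solving the first-step equations for $h$ gives $h(y)/h(x) = 1.5$, so the conditioned mean should come out $1.5$ times the unconditioned one.

```python
import random

# Sanity check of E_x[tau_y | T_A < inf] = (h(y)/h(x)) * E_x[tau_y]
# on a hand-picked CTMC. States: 0 = x, 1 = y, 2 = the set A,
# 3 = an absorbing "escape" state, so that T_A = infinity with
# positive probability. The rates are arbitrary illustrative choices.
RATES = {
    0: {1: 1.0, 3: 0.5},
    1: {2: 1.0, 0: 1.0, 3: 0.3},
}  # states 2 and 3 are absorbing (no outgoing rates)

def run_path(rng, start=0, y=1, target=2):
    """Simulate until absorption; return (time spent in y, hit A?)."""
    state, tau_y = start, 0.0
    while state in RATES:
        out = RATES[state]
        total = sum(out.values())
        hold = rng.expovariate(total)   # exponential holding time
        if state == y:
            tau_y += hold
        # jump to the next state with probability proportional to rate
        u, acc = rng.random() * total, 0.0
        for nxt, r in out.items():
            acc += r
            if u <= acc:
                state = nxt
                break
    return tau_y, state == target

def estimate(n=200_000, seed=1):
    """Return (E_x[tau_y], E_x[tau_y | T_A < inf]) by simulation."""
    rng = random.Random(seed)
    results = [run_path(rng) for _ in range(n)]
    e_tau = sum(t for t, _ in results) / n
    hits = sum(1 for _, hit in results if hit)
    e_tau_cond = sum(t for t, hit in results if hit) / hits
    return e_tau, e_tau_cond
```

Running `estimate()` and comparing `e_tau_cond / e_tau` against $h(y)/h(x) = 1.5$ (computed by hand for these rates via the first-step equations $h(2)=1$, $h(3)=0$) agrees to within Monte Carlo error, which at least confirms the identity is the right target.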

I've tried various approaches, such as conditioning on whether $T_A$ is finite or infinite, and conditioning on the event that $X$ eventually hits $y$, but to no avail. I feel this problem should be fairly elementary, but I have not been able to make any real progress with it.