In my probability class we are currently covering Markov chains, and I was stumped by parts of this problem:
Let $ \{ X_n \}_{n=0}^{\infty} $ be a homogeneous Markov chain (the transition probabilities are independent of time) with transition matrix $ P $. For a specific state $ j $ we know $ P(X_0=j)=1 $, and we define these times:
$ T_0 = 0 $
$ T_1 = \min\{n>0 | X_n = j \} $
$ T_2 = \min\{n>T_1 | X_n = j \} $
$ \vdots $
$ T_k = \min\{n>T_{k-1} | X_n = j \} $
And we define a new random process $ Y_n = T_n-T_{n-1} $ for $ n \geq 1 $.
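(To make these definitions concrete for myself, I wrote a quick simulation. The 3-state chain below is a made-up example, and `simulate_hits` is just a helper name I chose; any chain that keeps returning to $j$ would do.)

```python
import random

random.seed(0)

# A made-up 3-state chain (states 0, 1, 2); row x of P is the
# transition distribution out of state x. Illustrative only.
P = [[0.2, 0.5, 0.3],
     [0.4, 0.1, 0.5],
     [0.3, 0.3, 0.4]]

def simulate_hits(j, num_hits, P):
    """Run the chain from X_0 = j and return the hitting times T_1, ..., T_num_hits."""
    x, t, hits = j, 0, []
    while len(hits) < num_hits:
        x = random.choices(range(len(P)), weights=P[x])[0]  # one step of the chain
        t += 1
        if x == j:
            hits.append(t)
    return hits

T = simulate_hits(j=0, num_hits=5, P=P)
# Y_1 = T_1 - T_0 with T_0 = 0, and Y_k = T_k - T_{k-1} afterwards.
Y = [T[0]] + [T[k] - T[k-1] for k in range(1, len(T))]
print("T:", T)
print("Y:", Y)
```

The hitting times come out strictly increasing, the $Y_k$ are the gaps between consecutive hits, and they telescope back to $T_k$.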
A. We are asked to give some explanation (not a proof) of the following: $ P(Y_{n+1}=i) = P(X_{n+i}=j,\, X_{n+i-1} \neq j, \dots, X_{n+1} \neq j \mid X_n=j) $. To be honest, I have no real idea how to use the Markov property or the homogeneity of the chain here.
B. We are asked to prove that for any $ i>1 $ we have $ P(X_i=a_i,\dots,X_1=a_1 \mid X_0=j) = \left(\prod_{k=2}^{i} P(X_k=a_k \mid X_{k-1}=a_{k-1})\right) P(X_1=a_1 \mid X_0=j) $. Solved, thanks to Bayes' theorem.
C. We are to show that for all non-negative integer values of $ i $ this equality holds:
$ P(X_{n+i}=j \mid X_{n+i-1} \neq j, \dots, X_{n+1} \neq j \, , \, X_n=j) = \sum_{l \neq j} P(X_{n+i}=j \mid X_{n+i-1} = l \, , \, X_{n+i-2} \neq j, \dots, X_{n+1} \neq j \, , \, X_n=j) $. Again, I have no idea how to do this or even relate it to the Markov property.
D. We are asked to show, using the previous parts and the Markov property, that this equality holds:
$ P(X_{n+i}=j,\, X_{n+i-1} \neq j, \dots, X_{n+1} \neq j \mid X_n=j) = P(X_{i}=j,\, X_{i-1} \neq j, \dots, X_{1} \neq j \mid X_0=j) $
E. Using the previous parts and the Markov property, we are asked to show $ P(Y_n = k) = P(T_1=k) $. As a hint, we are asked to explain this intuitively, which seems clear: since the chain is homogeneous, only the transitions matter, not the absolute time.
I am familiar with Markov chains and understand the Markov property, but I do not see how to proceed on this problem, so I would certainly appreciate any help.
A. This is pretty straightforward. Think of $T_k$ as the $k$-th hitting time of the state $j$, i.e. the time at which the Markov chain reaches state $j$ for the $k$-th time. $Y_n$ is then the gap between the $(n-1)$-th and $n$-th hits. So $Y_{n+1}=i$ means the $n$-th hit is followed by $i-1$ steps in which the chain avoids $j$, and then a return to $j$ at step $i$; all the states of $X$ in between were something other than $j$. Hence, given $X_n=j$, the two events coincide:
$$ \{Y_{n+1}=i\} \iff \{X_{n+i}=j,\; X_{n+i-1} \neq j, \dots, X_{n+1} \neq j\} \quad \text{given } X_n=j $$
C. Are you sure there is no typo? It looks like each summand on the RHS should carry a factor $P(X_{n+i-1}=l \mid X_{n+i-1}\neq j, X_{n+i-2}\neq j, \dots, X_n=j)$. The proof should involve the Markov property and the law of total probability.
D. Basically, by the Markov property, the past is independent of the future given the present. Hence when $X$ hits the state $j$ for the $n$-th time, the future evolution is independent of the past, and so it has the same distribution as after hitting $j$ for the first time starting from $n=0$. This is also called a renewal process: the system resets/renews itself every time it hits the state $j$. Use this intuition together with the Markov property.
E. Again due to the renewal argument in D.
$$P(Y_n=k)=P(Y_1=k)=P(T_1-T_0=k)=P(T_1=k)$$
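If you want to convince yourself numerically, here is a quick Monte Carlo sketch (the 2-state chain and the helper name are my own, not from the problem): it estimates the distributions of $Y_1 = T_1$ and $Y_2 = T_2 - T_1$, which by the renewal argument should come out approximately equal.

```python
import random
from collections import Counter

random.seed(1)

# Made-up 2-state chain; we watch returns to state j = 0.
P = [[0.5, 0.5],
     [0.3, 0.7]]
j = 0

def first_two_hits():
    """Return (T_1, T_2) for one run of the chain started at X_0 = j."""
    x, t, hits = j, 0, []
    while len(hits) < 2:
        x = random.choices(range(len(P)), weights=P[x])[0]
        t += 1
        if x == j:
            hits.append(t)
    return hits

runs = [first_two_hits() for _ in range(20000)]
t1 = Counter(t[0] for t in runs)         # empirical distribution of Y_1 = T_1
y2 = Counter(t[1] - t[0] for t in runs)  # empirical distribution of Y_2 = T_2 - T_1
for k in range(1, 5):
    print(k, t1[k] / len(runs), y2[k] / len(runs))
```

For this chain $P(T_1=1)=P_{00}=0.5$ exactly, and the two empirical distributions agree up to sampling noise.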
Edit 1: Elaborating part C. Let $R$ be the event $R=\{X_{n+i-2}\neq j , \dots, X_{n+1}\neq j,\, X_n=j\}.$ Then
\begin{eqnarray} && P(X_{n+i}=j|X_{n+i-1}\neq j , \dots, X_{n+1}\neq j,\, X_n=j)\\ &=& P(X_{n+i}=j|X_{n+i-1}\neq j,R) \\ &=& \frac{P(X_{n+i}=j,X_{n+i-1}\neq j|R)}{P(X_{n+i-1}\neq j|R)} \\ &=& \sum_{l \neq j} \frac{P(X_{n+i}=j,X_{n+i-1}=l|R)}{P(X_{n+i-1}\neq j|R)} \\ &=& \sum_{l \neq j} P(X_{n+i}=j|X_{n+i-1}=l,R)\,\frac{P(X_{n+i-1}=l|R)}{P(X_{n+i-1}\neq j|R)} \\ &=& \sum_{l \neq j} P(X_{n+i}=j|X_{n+i-1}=l,R)\,P(X_{n+i-1}=l|X_{n+i-1}\neq j,R) \end{eqnarray}
The third equality is the law of total probability over the value of $X_{n+i-1}$ (which must be some $l\neq j$). Finally, by the Markov property the first factor in each summand collapses to a one-step transition probability: $P(X_{n+i}=j \mid X_{n+i-1}=l, R) = P(X_{n+i}=j \mid X_{n+i-1}=l) = P_{lj}.$
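As a sanity check of this decomposition (with the extra conditional factor), one can enumerate paths exactly on a small made-up chain using rational arithmetic. The 3-state chain below and the choice $n=0$, $i=4$ are just for illustration; the path probabilities come from part B's product formula, and both sides agree exactly.

```python
from itertools import product
from fractions import Fraction as F

# Hypothetical 3-state chain with exact rational entries; j = 0, n = 0, i = 4.
P = [[F(1, 2), F(1, 4), F(1, 4)],
     [F(1, 3), F(1, 3), F(1, 3)],
     [F(1, 4), F(1, 4), F(1, 2)]]
j, i = 0, 4
S = range(len(P))

def path_prob(path):
    """P(X_1 = path[0], ..., X_m = path[-1] | X_0 = j), via part B's product formula."""
    p, prev = F(1), j
    for a in path:
        p *= P[prev][a]
        prev = a
    return p

# All length-(i-1) paths (X_1, ..., X_{i-1}) that avoid j.
avoid = [q for q in product(S, repeat=i - 1) if all(a != j for a in q)]
p_avoid = sum(path_prob(q) for q in avoid)  # P(X_1 != j, ..., X_{i-1} != j | X_0 = j)

# LHS: P(X_i = j | X_{i-1} != j, ..., X_1 != j, X_0 = j)
lhs = sum(path_prob(q + (j,)) for q in avoid) / p_avoid

# RHS: sum over l != j of
#   P(X_i = j | X_{i-1} = l, ...) * P(X_{i-1} = l | X_{i-1} != j, ..., X_0 = j),
# where the first factor is P[l][j] by the Markov property.
rhs = sum(P[l][j] * (sum(path_prob(q) for q in avoid if q[-1] == l) / p_avoid)
          for l in S if l != j)

print(lhs, rhs, lhs == rhs)
```

Because everything is a `Fraction`, the equality `lhs == rhs` is exact, not merely up to floating-point error.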