First time two independent Markov chains reach same state


Let $\{X_n\}$ and $\{Y_n\}$ be two independent Markov chains with the same state space $\{0,1,2\}$ and the same transition probability matrix: $$ P=\begin{pmatrix} 0.5&0.25&0.25\\ 0.25&0.25&0.5\\ 0&0.5&0.5 \end{pmatrix} $$

Define $T=\inf\{n: X_n=Y_n\}$. Assuming $X_0=0$ and $Y_0=2$, find $\mathbb{E}[T]$ and $\mathbb{P}[X_T=2]$.
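Before doing anything analytic, it can help to have numbers to check against. The setup is easy to simulate directly; here is a minimal Monte Carlo sketch (the helper names `step` and `meet_time` are mine, not part of the problem):

```python
import random

P = [[0.50, 0.25, 0.25],
     [0.25, 0.25, 0.50],
     [0.00, 0.50, 0.50]]

def step(state, rng):
    # One transition according to row `state` of P.
    return rng.choices([0, 1, 2], weights=P[state])[0]

def meet_time(rng):
    # Run both chains from X_0 = 0, Y_0 = 2 until they first agree;
    # return the meeting time T and the common state X_T.
    x, y, t = 0, 2, 0
    while x != y:
        x, y = step(x, rng), step(y, rng)
        t += 1
    return t, x

rng = random.Random(2024)
samples = [meet_time(rng) for _ in range(100_000)]
ET = sum(t for t, _ in samples) / len(samples)
p2 = sum(x == 2 for _, x in samples) / len(samples)
print(f"estimated E[T] = {ET:.3f}, estimated P[X_T = 2] = {p2:.3f}")
```

The estimates from a run like this are only accurate to a couple of decimal places, but they are a useful sanity check on whatever exact answer you derive.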

I think I know how to compute the probability, but I'm having trouble computing the expectation of $T$.

My attempt: I thought it would be easy if I consider $$\mathbb{E}[T] = \sum_{i=0}^{2} \mathbb{E}[T\vert X_T = i]\cdot \mathbb{P}[X_T = i]$$

since the probability I need to compute in the first part is involved. So the only thing left to compute is $\mathbb{E}[T\vert X_T = i]$ for $i=0,1,2$. I know how to find such an expected value when only one Markov chain is involved, say the first time $X_n = 2$ given $X_0 = 0$, using a recurrence argument, but I'm stuck when the problem involves two Markov chains.
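One way the two-chain difficulty goes away: by independence, the pair $(X_n, Y_n)$ is itself a Markov chain on $\{0,1,2\}^2$ with transition probabilities $P(i,a)P(j,b)$, and $T$ is just the hitting time of the diagonal $\{(0,0),(1,1),(2,2)\}$. First-step analysis on the six off-diagonal states then gives both $\mathbb{E}[T]$ and $\mathbb{P}[X_T=2]$ at once. A sketch of that computation (variable names are mine):

```python
import numpy as np

P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.25, 0.50],
              [0.00, 0.50, 0.50]])

# The six off-diagonal pairs (i, j); these are the transient states for
# the hitting time T of the diagonal.
off = [(i, j) for i in range(3) for j in range(3) if i != j]
idx = {s: k for k, s in enumerate(off)}

# Q: product-chain transitions restricted to off-diagonal pairs.
# r2[s]: one-step probability of hitting the diagonal at (2, 2) from s.
Q = np.zeros((6, 6))
r2 = np.zeros(6)
for (i, j), k in idx.items():
    for a in range(3):
        for b in range(3):
            p = P[i, a] * P[j, b]
            if a != b:
                Q[k, idx[(a, b)]] += p
            elif a == 2:
                r2[k] += p

I = np.eye(6)
# First-step analysis: h = 1 + Q h, i.e. (I - Q) h = 1  (expected hitting times)
h = np.linalg.solve(I - Q, np.ones(6))
# g = r2 + Q g, i.e. (I - Q) g = r2  (probability the chains meet in state 2)
g = np.linalg.solve(I - Q, r2)

start = idx[(0, 2)]
print("E[T]     =", h[start])   # 118/35 ≈ 3.371
print("P[X_T=2] =", g[start])   # 15/28  ≈ 0.536
```

The same linear systems can of course be solved by hand; by the symmetry $h(i,j)=h(j,i)$ there are really only three unknowns in each.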

On the other hand, I found another question that solves the analogous expected value for a single Markov chain in a more direct way, without using conditional expectations, but I'm having trouble applying a similar argument in this case.

Any help would be appreciated.


On BEST ANSWER

Hint: $ET=\sum_{n\ge 0} P(T>n)=\sum_{n\ge 0} P(X_i \neq Y_i \text{ for all } 1\leq i \leq n)$ (the $i=0$ condition holds automatically, since $X_0=0\neq 2=Y_0$).
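To evaluate this sum concretely: restrict the product chain $(X_n, Y_n)$ to the six off-diagonal pairs, with sub-stochastic transition matrix $Q$. Then $P(T>n)$ is the total mass remaining after $n$ steps, and $\sum_{n\ge 0} P(T>n) = e^\top (I-Q)^{-1}\mathbf{1}$ for the starting distribution $e$ at $(0,2)$. A numerical sketch of both versions (my own variable names):

```python
import numpy as np

P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.25, 0.50],
              [0.00, 0.50, 0.50]])

# Off-diagonal pairs of the product chain; {T > n} is exactly the event
# of still being off-diagonal at time n.
off = [(i, j) for i in range(3) for j in range(3) if i != j]
idx = {s: k for k, s in enumerate(off)}
Q = np.zeros((6, 6))
for (i, j), k in idx.items():
    for a in range(3):
        for b in range(3):
            if a != b:
                Q[k, idx[(a, b)]] += P[i, a] * P[j, b]

e = np.zeros(6)
e[idx[(0, 2)]] = 1.0          # start at (X_0, Y_0) = (0, 2)

# Accumulate E[T] = sum_{n>=0} P(T>n); after n multiplications by Q,
# v.sum() equals P(T>n).
v = e.copy()
ET = 0.0
while v.sum() > 1e-12:
    ET += v.sum()
    v = v @ Q

# Closed form of the same sum: e^T (I - Q)^{-1} 1
ET_cf = e @ np.linalg.solve(np.eye(6) - Q, np.ones(6))
print("E[T] by tail sums  =", ET)
print("E[T] closed form   =", ET_cf)
```

Both computations agree, and truncating the tail sum at $10^{-12}$ costs only a negligible error since the rows of $Q$ sum to at most $0.75$.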