If I have a two-state Markov Chain, and I start a chain at state 1, and another at state 2, what is the expected time before they hit?


I have a two-state Markov Chain that looks like:

$$ P= \left(\begin{matrix} 0.4 &0.6 \\ 0.7 & 0.3 \end{matrix} \right). $$

From this, suppose I define $X_t$ and $Y_t$, where $X_t$ starts in state 1 above (corresponding to the row $0.4, 0.6$) and $Y_t$ starts in state 2. I would like to find the expected time before they coincide for the first time. I have identified that the first step is to define $\tau = \min\{n\geq 1 : X_n = Y_n\}$; then I want to find $E[\tau] = E[\min\{n\geq 1 : X_n = Y_n\}]$. However, I am not sure where to begin. I think perhaps I am failing to see a trick. Does anyone know of one?

Best answer:

They will coincide (for the first time) on the first step where one chain changes state and the other does not. They cannot coincide on a step where both chains change state or both stay put. By symmetry, the probability of these events at any step does not depend on which chain is in which state, only on the chains being in different states, so it is the same constant $p$ at every step. Thus the number of steps until coincidence has a Geometric$_1$ distribution.

$$N\sim\mathcal{Geo}_1(p) ~\iff~ \mathsf E(N) = \dfrac{1}{p}$$

Can you evaluate $p$ using your matrix?
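As a sanity check on the argument above, here is a minimal sketch (not from the original answer) that evaluates $p$ directly from the matrix and compares $1/p$ against a Monte Carlo estimate of $E[\tau]$; the function names and trial count are my own choices for illustration.

```python
import random

# Transition matrix from the question (row = current state, column = next state).
P = [[0.4, 0.6],
     [0.7, 0.3]]

# Analytic p: starting from distinct states, the chains coincide on the next
# step exactly when one switches and the other stays put. With two states,
# symmetry makes this the same from (state 1, state 2) and (state 2, state 1).
p = P[0][1] * P[1][1] + P[0][0] * P[1][0]  # X switches & Y stays, or X stays & Y switches
expected_tau = 1 / p                        # mean of a Geometric_1(p) variable

def step(state):
    """Advance one chain a single step according to P."""
    return 0 if random.random() < P[state][0] else 1

def coincidence_time():
    """Run X from state 1 and Y from state 2 until they first agree."""
    x, y, n = 0, 1, 0
    while x != y:
        x, y, n = step(x), step(y), n + 1
    return n

random.seed(0)
trials = 200_000
sim_mean = sum(coincidence_time() for _ in range(trials)) / trials
print(p, expected_tau, sim_mean)
```

With this matrix the simulated mean should settle near the geometric-mean prediction $1/p$.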