Compute the probability that the Markov chain $(X_n)_{n\geq 0}$ starting from state 1 will eventually reach state 3.


Problem:

Let $(X_n)_{n\geq 0}$ be a Markov Chain on the state space $S=\{0,1,2,3,4\}$ with transition probability matrix given by:

$$P=\begin{bmatrix} 1/2 & 1/2 & 0 & 0 & 0 \\ 0 & 1/2 & 1/2 & 0 & 0 \\ 0 & 0 & 1/3 & 2/3 & 0 \\ 0 & 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 0 & 1 & 0 \\ \end{bmatrix}$$

Compute the probability that the Markov chain $(X_n)_{n\geq 0}$, starting from state 1, will eventually reach state 3.

Attempt:

Well, I guess there are multiple ways to reach state 3 when starting from state 1, depending on the path one takes. But it's the word "eventually" that really confuses me...

Any hints?
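(Not a full solution, but here is a quick Monte Carlo sketch I used to build intuition: simulate many trajectories from state 1 and count how often state 3 is hit within a large step budget. The function name `reaches_3` and the cutoff `max_steps` are my own choices, not part of the problem.)

```python
import random

# Transition matrix from the problem, rows/columns indexed 0..4.
P = [
    [1/2, 1/2, 0,   0,   0],
    [0,   1/2, 1/2, 0,   0],
    [0,   0,   1/3, 2/3, 0],
    [0,   0,   0,   1/2, 1/2],
    [0,   0,   0,   1,   0],
]

def reaches_3(start=1, max_steps=10_000):
    """Simulate one trajectory; return True if state 3 is hit within max_steps."""
    state = start
    for _ in range(max_steps):
        if state == 3:
            return True
        # Sample the next state according to row `state` of P.
        state = random.choices(range(5), weights=P[state])[0]
    return False

trials = 10_000
estimate = sum(reaches_3() for _ in range(trials)) / trials
print(estimate)
```

The estimate comes out extremely close to 1, which matches the structural hint above: from states 1 and 2 the chain can only move "rightward" or stay put, so think about whether it can avoid state 3 forever.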