Calculating hitting probabilities of a Markov chain


Consider a Markov chain with state space $$S = \{1, 2, 3, 4, 5\}$$ and transition matrix $$P = \begin{pmatrix} 0 & 0 & \frac 1 2 & \frac 1 2 & 0\\ \frac 1 2 & 0 & \frac 1 2 & 0 & 0\\ \frac 1 2 & 0 & 0 & \frac 1 2 & 0\\ 0 & \frac 1 2 & 0 & 0 & \frac 1 2\\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}.$$ Suppose $X_0 = 3$. What is the probability that the chain hits state $5$ before it hits state $1$?

My working

Let $U_i$ be the probability that the chain, starting in state $i$, hits state $5$ before it hits state $1$. The boundary conditions are $U_1 = 0$ and $U_5 = 1$, and conditioning on the first step from each of the remaining states gives the system of linear equations $$U_2 = \frac 1 2 U_3, \qquad U_3 = \frac 1 2 U_4, \qquad U_4 = \frac 1 2 U_2 + \frac 1 2.$$ Solving this system, I get $$U_3 = \frac 2 7,$$ which, by definition, is the probability that the chain, starting in state $3$, hits state $5$ before hitting state $1$.
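As a numerical sanity check (not part of the working above), the same first-step equations can be solved as a linear system: restricting $P$ to the non-absorbing, non-boundary states $\{2, 3, 4\}$ gives a matrix $Q$, and the hitting probabilities satisfy $(I - Q)u = b$, where $b$ holds each state's one-step probability of jumping directly to state $5$. A minimal sketch with NumPy:

```python
import numpy as np

# Transition matrix from the problem (states 1..5 mapped to indices 0..4).
P = np.array([
    [0,   0,   0.5, 0.5, 0  ],
    [0.5, 0,   0.5, 0,   0  ],
    [0.5, 0,   0,   0.5, 0  ],
    [0,   0.5, 0,   0,   0.5],
    [0,   0,   0,   0,   1  ],
])

# U_i = P(hit 5 before 1 | X_0 = i), with U_1 = 0 and U_5 = 1 fixed.
# On the interior states {2, 3, 4} this becomes (I - Q) u = b.
interior = [1, 2, 3]                 # indices of states 2, 3, 4
Q = P[np.ix_(interior, interior)]    # one-step transitions within {2, 3, 4}
b = P[interior, 4]                   # one-step probability of reaching state 5
u = np.linalg.solve(np.eye(3) - Q, b)

print(dict(zip([2, 3, 4], u)))       # u[1] is U_3, which should be 2/7
```

This reproduces $U_3 = 2/7$ (along with $U_2 = 1/7$ and $U_4 = 4/7$), agreeing with the hand calculation.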


As I am just beginning to learn about Markov chains, this is my first time encountering such a problem, and I would like to know whether my working and answer are correct. If not, do kindly point out where I have gone wrong and why :)