A Markov chain with 3 states: $S = \left \{1,2,3\right\}$ has the following transition matrix:
$P = \begin{pmatrix} 0.65 & 0.28 & 0.07\\ 0.15 & 0.67 & 0.18\\ 0.12 & 0.36 & 0.52 \end{pmatrix}$
Knowing that the initial state is $2$, what is the probability that after 3 transitions, we will be in state $3$ without going to state $1$ in these 3 steps?
In other words, I need to find the following probability: $P(X_3=3,X_2\neq1,X_1\neq1\mid X_0 = 2) = \frac{P(X_0=2,X_1\neq1,X_2\neq1,X_3=3)}{P(X_0=2)}$.
I don't know if I have to split into cases, like $X_1=2$ or $X_1=3$ (and similarly for $X_2$), or whether there is a quicker way to do it.
Let $P'$ be the same matrix as $P$, except that state 1 is made absorbing (i.e. the first row is $1\ 0\ 0$).
Your solution is the entry in the second row and third column of $P'^3$.
A short explanation: if you don't want to pass through state 1, you can modify the Markov chain so that no path leads through state 1. Making state 1 absorbing works perfectly: any probability mass that enters state 1 stays there, so it can never contribute to the entry for state 3. After that, it is a straightforward calculation.
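To illustrate, here is a small sketch (using NumPy, with 0-indexed states, so state 2 is row index 1) that computes the entry of $P'^3$ and cross-checks it against a brute-force sum over all paths $2 \to x_1 \to x_2 \to 3$ with $x_1, x_2 \neq 1$:

```python
import numpy as np
from itertools import product

P = np.array([[0.65, 0.28, 0.07],
              [0.15, 0.67, 0.18],
              [0.12, 0.36, 0.52]])

# Modified chain P': state 1 (row index 0) is made absorbing.
Pp = P.copy()
Pp[0] = [1.0, 0.0, 0.0]

# Entry in row 2, column 3 of P'^3 (0-indexed: [1, 2]).
via_absorbing = np.linalg.matrix_power(Pp, 3)[1, 2]

# Brute force: enumerate paths 2 -> x1 -> x2 -> 3 avoiding state 1,
# i.e. x1, x2 range over index 1 (state 2) and index 2 (state 3).
brute = sum(P[1, x1] * P[x1, x2] * P[x2, 2]
            for x1, x2 in product([1, 2], repeat=2))

print(via_absorbing, brute)
```

Both computations agree, which confirms that making state 1 absorbing is equivalent to summing only over the paths that avoid it.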