I have a Markov chain with state space $S=\{1,2,3,4,5\}$ and transition matrix
$P= \begin{bmatrix} 0.2 & 0.8 & 0 & 0&0\\ 0 & 0.4 & 0.6&0&0 \\ 0 & 0 & 0.6&0.4&0 \\ 0.2&0&0&0.6&0.2 \\ 0&0&0&0&1 \end{bmatrix} $
If the chain starts from state $1$, I need to find the probability of entering state $4$ exactly $4$ times.
I have one idea for how to do this: introduce an indicator that counts every time the chain enters state $4$; the sum of the indicators should be $4$. Am I overthinking it? Where should I start?
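The indicator idea can be checked directly by simulation. Here is a minimal sketch (my own code, not from the original post) that runs the chain from state $1$ until absorption in state $5$, summing an indicator of being in state $4$ at each step; note it counts "entering state $4$" once per time step the chain spends there — if you instead count only transitions into $4$ from a different state, the indicator would need to change:

```python
import random

# Transition matrix from the question; row i is the distribution of the
# next state when the chain is currently in state i+1.
P = [
    [0.2, 0.8, 0.0, 0.0, 0.0],
    [0.0, 0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.6, 0.4, 0.0],
    [0.2, 0.0, 0.0, 0.6, 0.2],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]

def count_visits(rng):
    # Start in state 1 (index 0); run until absorption in state 5 (index 4),
    # summing an indicator of the chain being in state 4 (index 3).
    state, visits = 0, 0
    while state != 4:
        state = rng.choices(range(5), weights=P[state])[0]
        visits += (state == 3)
    return visits

rng = random.Random(1)
n = 100_000
p_hat = sum(count_visits(rng) == 4 for _ in range(n)) / n
print(f"estimated P(exactly 4 visits to state 4) = {p_hat:.4f}")
```

The estimate gives a sanity check for whatever closed-form answer you derive.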
Starting from state $1$, the chain reaches state $4$ with probability $1$: states $1,2,3$ are transient, and the only way out of them is the path $1 \to 2 \to 3 \to 4$. Once at state $4$, each time step there is followed by another visit to $4$ with probability $0.6 + 0.2 = 0.8$ (stay at $4$, or move to $1$ and return to $4$ with certainty), and by absorption in state $5$ with probability $0.2$. Therefore you can consider the lumped chain $$\begin{pmatrix} .8 & .2 \\ 0 & 1 \end{pmatrix}$$ and find the probability that, starting from state $1$, the self-loop $1 \to 1$ occurs exactly three times.
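The lumping can be verified exactly by first-step analysis on the full $5$-state chain. A sketch (my own code; it solves $(I-Q)a = b$ for the probabilities $a_s$ of reaching state $4$ before state $5$ from each transient state $s \in \{1,2,3\}$, then assembles the geometric answer):

```python
import numpy as np

# Transition matrix from the question (states 1..5 -> indices 0..4).
P = np.array([
    [0.2, 0.8, 0.0, 0.0, 0.0],
    [0.0, 0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.6, 0.4, 0.0],
    [0.2, 0.0, 0.0, 0.6, 0.2],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

# a[s] = P(reach state 4 before state 5 | start in s), for s in {1,2,3}.
Q = P[np.ix_([0, 1, 2], [0, 1, 2])]   # transitions among states 1,2,3
b = P[[0, 1, 2], 3]                   # one-step probabilities into state 4
a = np.linalg.solve(np.eye(3) - Q, b)

# From a time step at state 4, the chain visits 4 again with probability
# r = P(stay at 4) + P(go to 1) * P(1 reaches 4 before 5).
r = P[3, 3] + P[3, 0] * a[0]

# Exactly 4 visits: reach 4 (prob a[0]), revisit 3 times, then never return.
answer = a[0] * r**3 * (1 - r)
print(r, answer)
```

This reproduces the reduced chain: $r = 0.8$, and the answer is $0.8^3 \cdot 0.2$.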