Let $\;M=\begin{pmatrix} 0.25 & 0.5 & 0.25\\ 0.5 & 0.25 & 0.25\\ 0.5 & 0.25 & 0.25 \end{pmatrix}\;$ be the transition matrix of a Markov chain with states $S=\left\{0,1,2\right\}$. Calculate the expected number of steps until state $1$ is first reached, if we start from state $2$.
I've created this task myself, and I hope it is clear, because I couldn't find a real-life example or anything like that :)
The thing is, I have an exam coming up soon, and I'm not quite sure whether my solution is correct.
If we start from state $0$, we reach state $0$ with probability $0.25$, state $1$ with probability $0.5$, and state $2$ with probability $0.25$. Letting $k(i)$ denote the expected number of steps to reach state $1$ starting from state $i$, we have
$$k(0)=1+0.25k(0)+0.5k(1)+0.25k(2)$$
Same procedure for the other two states:
$$k(1)=1+0.5k(0)+0.25k(1)+0.25k(2)$$
$$k(2)=1+0.5k(0)+0.25k(1)+0.25k(2)$$
Now all we need to do is solve this linear system, and the resulting value of $k(2)$ is the expected value we are looking for. Is that right?
That is almost correct, except that your linear system does not encode which state you want to hit. To "tell" it that, you should set $k(1)=0$ (once you are in state $1$, no further steps are needed) and drop your equation for $k(1)$.
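As a sanity check, the corrected system can be solved numerically. A minimal sketch using NumPy (the variable names `Q` and `h` are my own notation, not from the question): setting $k(1)=0$ is equivalent to solving $(I - Q)\,h = \mathbf{1}$, where $Q$ is $M$ restricted to the non-target states $\{0,2\}$.

```python
import numpy as np

# Transition matrix of the chain on states {0, 1, 2}
M = np.array([[0.25, 0.5,  0.25],
              [0.5,  0.25, 0.25],
              [0.5,  0.25, 0.25]])

others = [0, 2]  # all states except the target state 1

# Expected hitting times h satisfy (I - Q) h = 1, where Q is M
# restricted to the non-target states; this encodes k(1) = 0.
Q = M[np.ix_(others, others)]
h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))

k0, k2 = h
print(k0, k2)  # k(0) = 16/7 ≈ 2.2857, k(2) = 20/7 ≈ 2.8571
```

So the expected number of steps from state $2$ to state $1$ is $20/7 \approx 2.86$.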
Remarks: