If I have a random walk Markov chain whose transition probability matrix is given by
$$ \mathbf{P} = \begin{array}{c|cccc} & 0 & 1 & 2 & 3 \\ \hline 0 & 1 & 0 & 0 & 0 \\ 1 & 0.3 & 0 & 0.7 & 0 \\ 2 & 0 & 0.3 & 0 & 0.7 \\ 3 & 0 & 0 & 0 & 1 \end{array} $$
I'm supposed to start in state 1, and determine the probability that the process is absorbed into state 0. I'm supposed to do so using the basic first step approach of equations: \begin{align*} u_1&=P_{10} + P_{11}u_1 + P_{12}u_2\\ u_2&=P_{20} + P_{21}u_1 + P_{22}u_2 \end{align*}
I also should use the results for a random walk given by: $$u_i = \begin{cases} \dfrac{N-i}{N} & p=q=1/2\\\\ \dfrac{(q/p)^i-(q/p)^N}{1-(q/p)^N} & p\neq q \end{cases}$$
Can I have some suggestions on how to proceed? Thanks for any and all help!
The probability that we reach state zero immediately is $0.3$. The next possibility is that we move to state two, return to state one, and then go to state zero; this event has probability $0.7\cdot0.3\cdot0.3=0.7\cdot0.3^2$. The probability of the next possibility is $0.7\cdot0.3\cdot0.7\cdot0.3\cdot0.3=0.7^2\cdot0.3^3$, and so on.
The probability that the walk is eventually absorbed at state zero is therefore $$\sum_{i=0}^{\infty} 0.7^{i}\,0.3^{i+1}=0.3\sum_{i=0}^{\infty} 0.21^{i}=\frac{0.3}{1-0.21}=\frac{30}{79}\approx 0.3797.$$
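As a sanity check (not part of the derivation above), here is a short Python sketch that solves the first-step equations exactly and compares the result with the $p\neq q$ gambler's-ruin formula; both give the same rational value:

```python
from fractions import Fraction

# Step-up and step-down probabilities read off the transition matrix
p, q = Fraction(7, 10), Fraction(3, 10)

# First-step equations: u1 = q + p*u2 and u2 = q*u1.
# Substituting the second into the first gives u1 = q / (1 - p*q).
u1 = q / (1 - p * q)
u2 = q * u1

# Closed-form random-walk result for p != q, with N = 3 and start i = 1:
# u_i = ((q/p)^i - (q/p)^N) / (1 - (q/p)^N)
r = q / p
u1_formula = (r**1 - r**3) / (1 - r**3)

print(u1, u2, u1_formula)  # u1 and u1_formula agree: 30/79
```

Using `Fraction` keeps the arithmetic exact, so the agreement between the two approaches is not masked by floating-point rounding.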