Computing probability for Markov chain


I'm given an infinite Markov chain with state space $\Omega=\{0,1,2,\dots\}$ and transition matrix $$P=\begin{pmatrix}p & 1-p & 0 & 0 & 0 & \dots \\ p & 0 & 1-p & 0 & 0 & \dots \\ p & 0 & 0 & 1-p & 0 & \dots \\ p & 0 & 0 & 0 & 1-p & \dots \\ \vdots & \vdots & & & \ddots & \ddots \end{pmatrix}$$ That is, $$p_{ij}= \begin{cases} p & \text{if } i\in\Omega,\ j=0, \\ 1-p & \text{if } j=i+1,\ i\in\Omega, \\ 0 & \text{otherwise.} \end{cases}$$
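For concreteness, the dynamics can be simulated directly (the value $p=0.3$ and the run length are arbitrary choices of mine, not part of the problem). The empirical occupancy of each low state should then match the long-run behaviour of the chain:

```python
import random

# Simulate the chain: from any state i, jump to 0 with probability p,
# otherwise move to i + 1.  (p = 0.3 and 200_000 steps are assumed values.)
random.seed(1)
p, steps = 0.3, 200_000

counts = {}
i = 0  # start state (the long-run frequencies do not depend on it)
for _ in range(steps):
    i = 0 if random.random() < p else i + 1
    counts[i] = counts.get(i, 0) + 1

# Compare empirical frequencies with p(1-p)^k for the first few states.
for k in range(4):
    print(k, counts.get(k, 0) / steps, p * (1 - p) ** k)
```

The empirical frequencies agree with the geometric profile $p(1-p)^k$ up to Monte Carlo noise.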

I have computed the stationary distribution as $$\pi_k=p(1-p)^k $$ for $k\geq 0$.
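As a sanity check on this $\pi$, one can truncate the chain at $N$ states (folding the tail probability of the last row back onto itself so rows stay stochastic, which is my own approximation, not part of the problem) and verify $\pi P \approx \pi$ numerically:

```python
import numpy as np

# Truncated version of the chain (assumption: N large enough that the
# tail mass (1-p)^N is negligible; p = 0.3 is an assumed example value).
N, p = 50, 0.3
P = np.zeros((N, N))
P[:, 0] = p                       # every state jumps to 0 w.p. p
for i in range(N - 1):
    P[i, i + 1] = 1 - p           # otherwise move one step right
P[N - 1, N - 1] += 1 - p          # fold tail mass back: rows sum to 1

# Candidate stationary distribution pi_k = p (1-p)^k.
pi = p * (1 - p) ** np.arange(N)

print(np.max(np.abs(pi @ P - pi)))  # ~ p(1-p)^N, essentially zero
```

The residual is of order $p(1-p)^N$, i.e. only the truncation error, confirming the balance equations $\pi_0=p\sum_i\pi_i=p$ and $\pi_j=\pi_{j-1}(1-p)$ for $j\geq 1$.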

I'm asked to compute the probability of reaching state $0$ after a "long time", assuming that we start at state $i\geq 3$. The answer is apparently $p(1-p)^2$, but I don't understand why. Could anyone help me out?

My own interpretation of the question is to compute $$\lim_{n \to \infty} P(X_{n+1}=0 \mid X_n \geq 3),$$ which I can't manage to get to equal $p(1-p)^2$. In fact, since $P(X_{n+1}=0 \mid X_n=i)=p$ for every state $i$, this limit seems to be just $p$, so I suspect my interpretation is wrong.
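A quick numerical check of that interpretation (using a truncated chain; $p=0.3$, $N=60$, start state $3$, and the number of steps are all assumed values of mine) shows the conditional probability is indeed $p$ and not $p(1-p)^2$:

```python
import numpy as np

# Truncated chain, assumed parameters (not from the original problem).
N, p = 60, 0.3
P = np.zeros((N, N))
P[:, 0] = p
for i in range(N - 1):
    P[i, i + 1] = 1 - p
P[-1, -1] += 1 - p            # keep rows stochastic after truncation

# Distribution of X_n started from state i = 3, after many steps.
mu = np.zeros(N)
mu[3] = 1.0
for _ in range(2000):
    mu = mu @ P

# P(X_{n+1} = 0 | X_n >= 3): every state on {3,4,...} jumps to 0 w.p. p.
cond = (mu[3:] * p).sum() / mu[3:].sum()
print(cond)                   # ≈ 0.3 (= p), not p(1-p)^2 ≈ 0.147
```

This is consistent with the one-line argument that conditioning on $X_n\geq 3$ cannot change the one-step jump probability $p$, so the intended question is presumably asking for something else.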

I'd appreciate any help.