Consider the Drunkard's walk Markov chain with state space $\mathcal{X} = \{0, 1, \ldots, N\}$ and transition matrix:
$$P = \begin{bmatrix} 1 & 0 & 0 & 0 & \cdots & 0 & 0 \\ \beta & 0 & \alpha & 0 & \cdots & 0 & 0 \\ 0 & \beta & 0 & \alpha & \cdots & 0 & 0 \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & \beta & 0 & \alpha & 0 \\ 0 & 0 & \cdots & 0 & \beta & 0 & \alpha \\ 0 & 0 & \cdots & 0 & 0 & 0 & 1 \end{bmatrix}$$
where $0 < \alpha < 1$ is the probability of moving one step from position $k$ to position $k+1$, and $\beta = 1 - \alpha$ is the probability of moving from position $k$ to position $k-1$, for $k = 1, \ldots, N-1$.
Given an initial distribution $(0, \ldots, 0, 1, 0, \ldots, 0)$ with a $1$ in the $j$-th entry, let $p_j$, for $j = 0, \ldots, N$, be the probability that $X_n = N$ for some $n \geq 0$ (the drunkard reaches home). Find a set of linear equations for the $p_j$.
I do not know how to start with this exercise. Intuitively I think we will have to use the formula $p^{(n)}(y \mid x) = P^n(x, y)$, but I do not see how to get equations for the $p_j$ from that. Could you help me?
Basically you just need to read the matrix. Consider $p_0$. You have:
$$ p_0 = 0 $$
since the matrix says that if you start at state $0$ you stay there forever. What about $p_1$? Simply, you have:
$$ p_1 = \beta p_0 + \alpha p_2 = \alpha p_2 $$ By conditioning on the first step: the probability of reaching home from state $1$ is $\beta$ times the probability of reaching home from $0$ (which is zero) plus $\alpha$ times the probability of reaching home from $2$. Then you have:
$$ p_2 = \beta p_1 + \alpha p_3 $$
The same first-step argument gives $p_j = \beta p_{j-1} + \alpha p_{j+1}$ for $j = 1, \ldots, N-1$, together with the boundary conditions $p_0 = 0$ and $p_N = 1$. I guess you can go on from there.
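If you want to sanity-check the resulting system numerically, here is a minimal sketch with NumPy. The values $N = 5$ and $\alpha = 0.6$ are arbitrary choices for illustration; the script builds the equations $p_0 = 0$, $p_j = \beta p_{j-1} + \alpha p_{j+1}$, $p_N = 1$ as a matrix system and compares the solution with the classical gambler's-ruin closed form $p_j = \frac{1-(\beta/\alpha)^j}{1-(\beta/\alpha)^N}$ (valid when $\alpha \neq \beta$).

```python
import numpy as np

# Illustrative parameters (not from the exercise statement).
N, alpha = 5, 0.6
beta = 1 - alpha

# Build the (N+1) x (N+1) linear system A p = b.
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0            # equation: p_0 = 0
A[N, N] = 1.0
b[N] = 1.0               # equation: p_N = 1
for j in range(1, N):
    # equation: p_j - beta * p_{j-1} - alpha * p_{j+1} = 0
    A[j, j - 1] = -beta
    A[j, j] = 1.0
    A[j, j + 1] = -alpha

p = np.linalg.solve(A, b)

# Closed-form gambler's-ruin solution for comparison (alpha != beta).
r = beta / alpha
closed = (1 - r ** np.arange(N + 1)) / (1 - r ** N)

print(np.allclose(p, closed))  # True
```

The matrix rows are just the answer's equations read off one state at a time, which is why the off-diagonal entries of $A$ mirror the $\beta$ and $\alpha$ entries of $P$.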