Meaning of a summation condition on the transition probabilities of a Markov chain


From Ross Ch. 4 Ex. 45 (c):

Consider an irreducible finite Markov chain with states $0,1,\dots,N$.

Let $x_i=P\{$visit state $N$ before state $0$ | start in $i\}$. Then

$$x_i=\sum_{j=0}^{N}P_{ij}x_j\quad(i=1,\dots,N-1),\qquad x_0=0,\qquad x_N=1$$

If $\sum_jjP_{ij}=i$ for $i=1,\dots,N-1$, show that $x_i=i/N$ is a solution to the above equations.
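For concreteness, here is a quick numerical sanity check (a sketch, not part of Ross's exercise): I take a hypothetical symmetric random walk on $\{0,\dots,N\}$ with reflection at the endpoints, which is irreducible and satisfies $\sum_j jP_{ij}=i$ for interior $i$, and verify that $x_i=i/N$ solves the equations above.

```python
import numpy as np

N = 10

# Hypothetical example chain: symmetric random walk on {0,...,N},
# reflecting at the endpoints so the chain is irreducible.
P = np.zeros((N + 1, N + 1))
for i in range(1, N):
    P[i, i - 1] = 0.5
    P[i, i + 1] = 0.5
P[0, 1] = 1.0
P[N, N - 1] = 1.0

# The condition sum_j j * P[i][j] == i holds for i = 1,...,N-1:
j = np.arange(N + 1)
assert np.allclose((P @ j)[1:N], np.arange(1, N))

# Check that x_i = i/N satisfies x_i = sum_j P[i][j] * x_j for
# interior i, together with the boundary values x_0 = 0, x_N = 1.
x = j / N
assert np.allclose((P @ x)[1:N], x[1:N])
print("x_i = i/N satisfies the equations")
```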

What does $\sum_jjP_{ij}=i$ even mean? I'm having trouble making any sense out of it. How can the summation of something be a state?