Show $\sum^n_{j=0} P_{ij} = \sum^n_{j=0} P(X_1 =j | X_0 = i) = 1 $


Let $P$ be the one-step transition matrix of a Markov chain with states $\{0,1,\dots,n\}$. Show $\sum^n_{j=0} P_{ij} = \sum^n_{j=0} P(X_1 =j | X_0 = i) = 1 $

I understand that this is the row sum, but how do I prove it?


Starting from state $i$, the chain must move to one of the possible states, so the whole sample space is covered by the events $\{X_1 = j\}$:

$$S_1=\bigcup_{j=0}^n \{X_1=j\}$$

Since the events $\{X_1=j\}$ are pairwise disjoint (the chain occupies exactly one state at time $1$), the conditional probabilities add:

$$\sum^n_{j=0} P_{ij} = \sum^n_{j=0} P(X_1 =j \mid X_0 = i) = P\Big(\bigcup_{j=0}^n \{X_1=j\} \,\Big|\, X_0 = i\Big)=P(S_1\mid X_0 = i)= 1 $$

This row-sum property is precisely the stochasticity condition imposed on the transition matrix of a Markov chain: every row of $P$ is a probability distribution over the next state.
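As a quick numerical illustration, here is a small sketch with a made-up $3$-state transition matrix (the matrix entries are assumptions, chosen only so each row is a probability distribution), checking that every row sums to $1$:

```python
import numpy as np

# Hypothetical transition matrix for states {0, 1, 2};
# row i lists P(X_1 = j | X_0 = i) for j = 0, 1, 2.
P = np.array([
    [0.2, 0.5, 0.3],
    [0.0, 0.6, 0.4],
    [0.7, 0.1, 0.2],
])

# Stochasticity condition: each row must sum to 1.
row_sums = P.sum(axis=1)
assert np.allclose(row_sums, 1.0)
print(row_sums)
```

If any row summed to something other than $1$, $P$ would not be a valid one-step transition matrix.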