Difference in conditional probabilities in a Markov chain


I am studying Markov chains and I want to know the expression of $P(X_{n+1} = j)$, but I don't understand the difference between:

$\sum_{i} P(X_{n+1} = j \mid X_{n} = i) \cdot P(X_{n} = i)$

and $\sum_{i} P(X_{n+1} = j \mid X_{n} = i)$.

I understand that the first expression is the sum of the joint probabilities $P(X_{n+1} = j, X_{n} = i)$, which means that both events occur.

But intuitively the second expression seems to be the same, since it is the sum over all $i$ of the probabilities of $X_{n+1} = j$ given the event $X_{n} = i$.

There is 1 best solution below


Consider a series of coin flips as a Markov chain. You have two states, H and T, with the marginal and conditional probabilities $$P(X_n=H)=P(X_n=T)=\frac{1}{2}\\P(X_{n+1}=T\mid X_n=H)=P(X_{n+1}=T\mid X_n=T)=\frac{1}{2}$$ The first expression in your question evaluates to $\frac{1}{2}\times\frac{1}{2}+\frac{1}{2}\times\frac{1}{2}=\frac{1}{2}$, which is correct.

The second expression evaluates to $\frac{1}{2}+\frac{1}{2}=1$, which is clearly not correct.
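If it helps to see this numerically, here is a minimal sketch of the coin-flip chain from the answer. The dictionary names (`p_state`, `p_trans`) are illustrative, not from the original post:

```python
# Fair-coin Markov chain: two states, every probability is 1/2.
states = ["H", "T"]

# Marginal probabilities P(X_n = i)
p_state = {"H": 0.5, "T": 0.5}

# Transition probabilities P(X_{n+1} = j | X_n = i)
p_trans = {(i, j): 0.5 for i in states for j in states}

j = "T"  # target state for X_{n+1}

# First expression: sum_i P(X_{n+1}=j | X_n=i) * P(X_n=i)
# (law of total probability -- yields P(X_{n+1}=j))
expr1 = sum(p_trans[(i, j)] * p_state[i] for i in states)

# Second expression: sum_i P(X_{n+1}=j | X_n=i), unweighted
expr2 = sum(p_trans[(i, j)] for i in states)

print(expr1)  # 0.5 -- a valid probability
print(expr2)  # 1.0 -- not P(X_{n+1}=j)
```

The weighting by $P(X_n = i)$ is exactly what turns the conditional probabilities into a proper marginal; dropping it sums probabilities from different conditioning events, which need not (and here does not) give a probability of any single event.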