Markov chain knowing future


I was wondering whether $P(X_1 = s_1 \mid X_0 = s_0)$ and $P(X_1 = s_1 \mid X_0 = s_0, X_2 = s_2)$ are the same. In other words, can we get some information about $X_1$ from future states?

Thanks!


Whenever you have trouble reasoning about conditional probabilities, it may be worth thinking first about the corresponding joint probabilities. For example, with stochastic processes it is common to have the impression that the past of the process affects the future, so a question about the conditional distribution of the past given the future can cause confusion. In fact, probability makes these things perfectly symmetric: we are just talking about $P(X_0\in A_0,X_1\in A_1)$, and the only question is whether we divide it by $P(X_0\in A_0)$ or by $P(X_1\in A_1)$.
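Written out, the symmetry is just the definition of conditional probability applied in both directions to the same joint probability:

$$P(X_1\in A_1 \mid X_0\in A_0) = \frac{P(X_0\in A_0,\,X_1\in A_1)}{P(X_0\in A_0)}, \qquad P(X_0\in A_0 \mid X_1\in A_1) = \frac{P(X_0\in A_0,\,X_1\in A_1)}{P(X_1\in A_1)}.$$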

For example, if in the Markov chain you can reach state $1$ only from state $0$ and from no other state, then given that $X_2 = 1$ you can deduce with certainty that $X_1 = 0$, since in all other cases the joint probability is zero.
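To answer the original question numerically: here is a minimal sketch with a hypothetical two-state chain (the transition matrix below is an assumption chosen for illustration, not from the post). It computes $P(X_1 = 1 \mid X_0 = 0)$ directly from the transition matrix, then computes $P(X_1 = 1 \mid X_0 = 0, X_2 = 0)$ by forming joint probabilities over two-step paths and dividing, exactly as described above. The two answers differ, so conditioning on the future does add information.

```python
# Hypothetical two-state Markov chain (assumption for illustration).
# P[i][j] = P(X_{n+1} = j | X_n = i); each row sums to 1.
P = [[0.5, 0.5],
     [0.2, 0.8]]

# P(X1 = 1 | X0 = 0): just the one-step transition probability.
p_given_past = P[0][1]

# P(X1 = 1 | X0 = 0, X2 = 0): joint probability of the path
# 0 -> 1 -> 0, divided by the probability of all paths 0 -> k -> 0.
joint_numer = P[0][1] * P[1][0]
joint_denom = sum(P[0][k] * P[k][0] for k in range(2))
p_given_future = joint_numer / joint_denom

print(p_given_past)    # 0.5
print(p_given_future)  # 2/7 ≈ 0.2857, not equal to 0.5
```

Observing $X_2 = 0$ makes it more likely that $X_1$ was $0$ (since state $0$ is harder to reach from state $1$ here), which is why the conditional probability of $X_1 = 1$ drops.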