Markov Assumption, Independence, and Conditional Probability Rules


In general, if $X$ and $Y$ are conditionally independent given $Z$, then it follows that $$P(X,Y|Z) = P(X|Z)P(Y|Z).$$
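As a quick sanity check of this rule (not part of the question itself), one can enumerate a small joint distribution over binary variables. The probability tables below are made-up numbers; the joint is constructed as $P(Z)P(X|Z)P(Y|Z)$, so the conditional independence holds by construction and the factorization can be verified exhaustively:

```python
# Toy check of P(X,Y|Z) = P(X|Z) P(Y|Z) for conditionally independent X, Y.
# All variables are binary; the probability tables are arbitrary made-up numbers.
from itertools import product

p_z = {0: 0.6, 1: 0.4}
p_x_given_z = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_x_given_z[z][x]
p_y_given_z = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}  # p_y_given_z[z][y]

# Joint built from the factorization, so X and Y are independent given Z.
joint = {(x, y, z): p_z[z] * p_x_given_z[z][x] * p_y_given_z[z][y]
         for x, y, z in product([0, 1], repeat=3)}

def p_xy_given_z(x, y, z):
    """P(X=x, Y=y | Z=z) computed directly from the joint."""
    pz = sum(v for (xx, yy, zz), v in joint.items() if zz == z)
    return joint[(x, y, z)] / pz

# Compare the directly computed conditional against the product of factors.
for x, y, z in product([0, 1], repeat=3):
    lhs = p_xy_given_z(x, y, z)
    rhs = p_x_given_z[z][x] * p_y_given_z[z][y]
    assert abs(lhs - rhs) < 1e-12
print("P(X,Y|Z) = P(X|Z)P(Y|Z) holds on every assignment")
```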

I am working on a problem where I use the Markov assumption

$$P(X_t|X_{0:t-1}, U_{1:t}) = P(X_t|X_{t-1}, U_t)$$

meaning that the posterior density of $X$ at time $t$ depends only on the previous state $X_{t-1}$ and the current input $U_t$.

My question is: does this assumption allow me to use independence properties with any other variable I might encounter, as long as I have sufficient information to make the assumption true? For example, if $Z$ satisfies the Markov assumption for $X$, could one state

$$P(X,Y|Z,A) = P(X|Z)P(Y|Z,A)$$

I am trying to gauge how powerful the assumption is. If I know that one of my factored distributions is independent of all other information given the previous state and current input, can I combine the two distributions with different conditioning sets, as in the example above?

To clarify, suppose we are given the right-hand side $P(X|Z)P(Y|Z,A)$. If $Z = (X_{t-1}, U_t)$, I am asking whether we know these two distributions are independent and can thus express their product as $P(X,Y|Z,A)$.
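One way to probe this numerically (again with made-up binary tables, not an answer to the question) is to build a joint in which $X$ is independent of both $Y$ and $A$ given $Z$, i.e. a joint of the form $P(Z)P(A)P(X|Z)P(Y|Z,A)$, and then check the proposed identity by enumeration:

```python
# Numerical probe of the proposed identity P(X,Y|Z,A) = P(X|Z) P(Y|Z,A).
# The joint below is hypothetical, built as P(Z) P(A) P(X|Z) P(Y|Z,A),
# i.e. X is independent of (Y, A) given Z by construction.
from itertools import product

p_z = {0: 0.5, 1: 0.5}
p_a = {0: 0.3, 1: 0.7}
p_x_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}            # p_x_z[z][x]
p_y_za = {(z, a): {0: 0.1 + 0.2 * z + 0.3 * a,
                   1: 0.9 - 0.2 * z - 0.3 * a}
          for z in (0, 1) for a in (0, 1)}                     # p_y_za[(z, a)][y]

joint = {(x, y, z, a): p_z[z] * p_a[a] * p_x_z[z][x] * p_y_za[(z, a)][y]
         for x, y, z, a in product([0, 1], repeat=4)}

def p_xy_given_za(x, y, z, a):
    """P(X=x, Y=y | Z=z, A=a) computed directly from the joint."""
    pza = sum(v for (xx, yy, zz, aa), v in joint.items() if zz == z and aa == a)
    return joint[(x, y, z, a)] / pza

ok = all(abs(p_xy_given_za(x, y, z, a) - p_x_z[z][x] * p_y_za[(z, a)][y]) < 1e-12
         for x, y, z, a in product([0, 1], repeat=4))
print("identity holds on this particular joint:", ok)
```

On this particular joint the identity comes out true, but that is because the construction also forces $P(X|Z,A) = P(X|Z)$; the check says nothing about joints where $X$ depends on $A$ given $Z$.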