Proving the Markov property from independent variables


Let $\mathbf{w}$ and $\mathbf{v}$ be two independent random variables, and let $\mathbf{y}$ be another random variable. Consider the following equations:

\begin{align} \mathbf{x} &= f(\mathbf{y}, \mathbf{w}),\\ \mathbf{z} &= g(\mathbf{y}, \mathbf{v}). \end{align}

How can one prove that $p(x \mid y, z) = p(x \mid y)$? Thanks!




The key here is that $\sigma(y,z) = \sigma(y, g(y,v)) \subseteq \sigma(y,v)$ (assuming that both $g$ and $f$ are measurable). For a test function $h$, the tower property then gives $$\mathbb{E}[h(x) \mid y,z] = \mathbb{E}[h(f(y,w)) \mid y,z] = \mathbb{E}\bigl[\mathbb{E}[h(f(y,w)) \mid y,v] \,\big|\, y,z\bigr] = \mathbb{E}[h(f(y,w)) \mid y],$$ where the last equality holds once you know that $\mathbb{E}[h(f(y,w)) \mid y,v] = \mathbb{E}[h(f(y,w)) \mid y]$: that quantity is $\sigma(y)$-measurable and $\sigma(y) \subseteq \sigma(y,z)$. I assumed that you can prove something like $$\mathbb{E}\bigl[h(f(y,w))\,\mathbf{1}_A(y)\,\mathbf{1}_B(v)\bigr] = \mathbb{E}\bigl[h(f(y,w))\,\mathbf{1}_A(y)\bigr]\,\mathbb{P}(v \in B)$$ for any measurable sets $A$ and $B$, which is exactly the defining identity of that conditional expectation. This should be fine if you know the joint law of $y$, $w$, and $v$ — for instance, if $v$ is independent of the pair $(y,w)$.
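As a Monte-Carlo sanity check of the claimed identity $p(x \mid y,z) = p(x \mid y)$, one can pick concrete discrete variables and compare empirical conditionals. The specific $f$ and $g$ below are illustrative assumptions, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500_000

# y, w, v mutually independent Bernoulli(1/2) variables
y = rng.integers(0, 2, N)
w = rng.integers(0, 2, N)
v = rng.integers(0, 2, N)

# Hypothetical choices of f and g for illustration
x = (y + w) % 2        # x = f(y, w)
z = y * v              # z = g(y, v)

# Compare p(x=1 | y=1, z=z0) against p(x=1 | y=1) for each value z0.
# Here p(x=1 | y=1) = p(w=0) = 1/2, and conditioning on z should not move it.
base = x[y == 1].mean()
for z0 in (0, 1):
    m = (y == 1) & (z == z0)
    print(f"p(x=1|y=1,z={z0}) = {x[m].mean():.3f}  vs  p(x=1|y=1) = {base:.3f}")
```

All three estimates agree up to sampling noise, matching the conditional independence $x \perp z \mid y$.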


I think the appropriate keywords are d-separation and the Bayes ball algorithm (relevant link). It is an efficient algorithm for verifying conditional independence in a directed graphical model: here the DAG has edges $y \to x$, $w \to x$, $y \to z$, $v \to z$, and conditioning on $y$ blocks the only path between $x$ and $z$.