Conditional independence and conditional expectation for joint Gaussian vectors.


I am reading Hajek's book "Random Processes for Engineers". Example 4.5 states: let $X,Y,Z$ be jointly Gaussian vectors. Then $X$ and $Z$ are conditionally independent given $Y$ if and only if $$\mathbb{E}[X|Y,Z] = \mathbb{E}[X|Y].$$ As noted there, both $X$ given $Y=y,Z=z$ and $X$ given $Y=y$ are Gaussian, so their distributions are equal if and only if their means and covariances agree for every $y$ and $z$. From this I can show that conditional independence implies the equality of conditional expectations. But how does one show the converse?
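For reference, the direction I can already prove goes like this (a sketch assuming conditional densities exist; the density notation is mine, not from the book):

```latex
% Conditional independence of X and Z given Y means
% f_{X|Y,Z}(x|y,z) = f_{X|Y}(x|y), hence
\begin{align*}
\mathbb{E}[X \mid Y=y, Z=z]
  &= \int x \, f_{X \mid Y,Z}(x \mid y,z)\, dx \\
  &= \int x \, f_{X \mid Y}(x \mid y)\, dx    % by conditional independence
  &= \mathbb{E}[X \mid Y=y].
\end{align*}
```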


Multiply both sides on the right by $Z^\top$ to get $E[X|Y,Z]Z^\top=E[X|Y]Z^\top$. On the left side you can pull $Z^\top$ inside the conditional expectation, since $Z$ is measurable with respect to the conditioning, so $E[XZ^\top|Y,Z]=E[X|Y]Z^\top$. Now take the conditional expectation given $Y$ on both sides; by the tower property the left side becomes $E[XZ^\top|Y]$, so $E[XZ^\top|Y]=E[E[X|Y]Z^\top|Y]$. On the right side pull out the $Y$-measurable factor $E[X|Y]$ to get $E[XZ^\top|Y]=E[X|Y]E[Z|Y]^\top$. Hence the conditional cross-covariance $\operatorname{Cov}(X,Z|Y)=E[XZ^\top|Y]-E[X|Y]E[Z|Y]^\top$ is zero. Since $(X,Z)$ given $Y=y$ is jointly Gaussian, zero conditional covariance implies that $X$ and $Z$ are conditionally independent given $Y$.
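The equivalence can also be checked numerically. The sketch below (my own illustrative construction, not from the post) builds scalar jointly Gaussian $X = 2Y + N_1$, $Z = -Y + N_2$ with independent noises, so $X$ and $Z$ are conditionally independent given $Y$; it then verifies that the conditional covariance of $(X,Z)$ given $Y$ (a Schur complement) has zero off-diagonal entry, and that the $Z$-coefficient in the linear formula for $E[X|Y,Z]$ vanishes, i.e. $E[X|Y,Z]=E[X|Y]$.

```python
import numpy as np

# X = a*Y + N1, Z = b*Y + N2, with Y, N1, N2 independent Gaussians.
a, b = 2.0, -1.0
var_y, var_n1, var_n2 = 1.0, 1.0, 0.5

# Joint covariance matrix of (X, Y, Z).
Sigma = np.array([
    [a*a*var_y + var_n1, a*var_y, a*b*var_y],
    [a*var_y,            var_y,   b*var_y],
    [a*b*var_y,          b*var_y, b*b*var_y + var_n2],
])

# Conditional covariance of (X, Z) given Y via the Schur complement:
# Cov((X,Z)|Y) = Sigma_(XZ)(XZ) - Sigma_(XZ)Y Sigma_YY^{-1} Sigma_Y(XZ)
S_xz = Sigma[np.ix_([0, 2], [0, 2])]
S_xz_y = Sigma[np.ix_([0, 2], [1])]
cond_cov = S_xz - S_xz_y @ S_xz_y.T / Sigma[1, 1]
print(cond_cov[0, 1])  # 0.0 -> X, Z conditionally independent given Y

# Coefficients (c_y, c_z) in E[X | Y, Z] = c_y*Y + c_z*Z
# (zero-mean case): solve Sigma_(YZ)(YZ) c = Sigma_(YZ)X.
coeffs = np.linalg.solve(Sigma[np.ix_([1, 2], [1, 2])],
                         Sigma[np.ix_([1, 2], [0])])
print(coeffs.ravel())  # [2. 0.] -> E[X|Y,Z] = 2*Y = E[X|Y]
```

The zero off-diagonal conditional covariance and the zero $Z$-coefficient are two sides of the same fact, which is exactly the equivalence the answer proves.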