Conditional expectation $E(Y|X,W)$


Suppose $Z = (X,Y)^T$ is a random vector such that the distribution of $Z$ conditional on $W$ is multivariate normal with parameters: $$ \mu = \left(\begin{matrix}\mu_0\\ \nu_0\end{matrix}\right) $$ $$ \Sigma=\left(\begin{matrix}\Sigma_{1,1}&\Sigma_{1,2}\\\Sigma_{2,1}&\Sigma_{2,2}\end{matrix}\right)$$ Show that the distribution of $Y$ conditional on $X$ and $W$ is multivariate normal with parameters: $$ E(Y|X,W) = E(Y|W) + \Sigma_{2,1}\Sigma_{1,1}^{-1}(X - E(X|W)) $$ $$ \operatorname{var}(Y|X,W) = \Sigma_{2,2} - \Sigma_{2,1} \Sigma_{1,1}^{-1} \Sigma_{1,2} $$
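As a quick sanity check of these two formulas, here is a Monte Carlo sketch in one dimension for both $X$ and $Y$ (conditioning on a fixed value of $W$ throughout, so only the Gaussian conditioning step is exercised; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters of (X, Y) given a fixed W
mu = np.array([1.0, -2.0])            # (mu_0, nu_0)
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.5]])        # [[S_11, S_12], [S_21, S_22]]

# Closed-form conditional moments of Y given X = x (the formulas above)
x = 1.5
cond_mean = mu[1] + Sigma[1, 0] / Sigma[0, 0] * (x - mu[0])
cond_var = Sigma[1, 1] - Sigma[1, 0] * Sigma[0, 1] / Sigma[0, 0]

# Monte Carlo estimate: keep draws whose X-coordinate falls near x
Z = rng.multivariate_normal(mu, Sigma, size=2_000_000)
Y_sel = Z[np.abs(Z[:, 0] - x) < 0.02, 1]

print(cond_mean, Y_sel.mean())   # the two values should roughly agree
print(cond_var, Y_sel.var())
```

The empirical mean and variance of the retained $Y$ values match the closed-form expressions to within Monte Carlo error.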

Here is what I have so far:

Assume that $\Sigma_{1,1}$ is invertible. First we find a matrix $C$ of constants such that $V := Y - CX$ is conditionally uncorrelated with $X$ given $W$. For this to hold we demand $$ 0= \operatorname{cov} (V, X \mid W)=\operatorname{cov} (Y - CX, X \mid W)=\Sigma_{2,1}-C\Sigma_{1,1}, $$ which yields $$ C=\Sigma_{2,1}\Sigma_{1,1}^{-1}. $$ Since $(V, X)$ is jointly normal given $W$ and $\operatorname{cov}(V, X \mid W) = 0$, the pair $V$, $X$ is conditionally independent given $W$, so $E(V|X,W) = E(V|W)$. Therefore $$ \begin{align} E(Y|X,W)&=E(V + C X|X,W)\\ &=E(V|W) + CX\\ &=E(Y|W) + C(X - E(X|W))\\ &=E(Y|W) + \Sigma_{2,1}\Sigma_{1,1}^{-1}(X - E(X|W)). \end{align} $$ How can I prove the formula for the variance?
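One possible sketch of the variance step, using the same decomposition $Y = V + CX$ and the conditional independence of $V$ and $X$ given $W$ (which follows from joint conditional normality plus zero conditional covariance):

$$ \begin{align} \operatorname{var}(Y|X,W)&=\operatorname{var}(V + CX \mid X, W)\\ &=\operatorname{var}(V \mid X, W)\\ &=\operatorname{var}(V \mid W)\\ &=\operatorname{var}(Y - CX \mid W)\\ &=\Sigma_{2,2} - C\Sigma_{1,2} - \Sigma_{2,1}C^T + C\Sigma_{1,1}C^T\\ &=\Sigma_{2,2} - \Sigma_{2,1}\Sigma_{1,1}^{-1}\Sigma_{1,2}, \end{align} $$

where the last line substitutes $C=\Sigma_{2,1}\Sigma_{1,1}^{-1}$ and uses $\Sigma_{1,2}^T=\Sigma_{2,1}$ and the symmetry of $\Sigma_{1,1}$, so all three correction terms equal $\Sigma_{2,1}\Sigma_{1,1}^{-1}\Sigma_{1,2}$.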

Thanks for the help.