Best Linear Prediction and variance decomposition


Let $(X, Y)$ have finite moments of at least second order, and let $\hat{Y} = a + bX$. We choose the coefficients $a = E(Y) - \frac{\sigma_{XY}}{\sigma_{X}^2}E(X)$ and $b = \frac{\sigma_{XY}}{\sigma_{X}^2}$, so that $\hat{Y}$ is the best linear prediction of $Y$, where "best" means minimum expected squared distance between outcomes of $Y$ and outcomes of $\hat{Y}$.
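As a sanity check, these coefficients can be verified numerically. The sketch below (the simulated joint distribution, sample size, and seed are arbitrary illustrative choices, not from the question) estimates $a$ and $b$ from sample moments and confirms that perturbing either coefficient can only increase the mean squared prediction error:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# any joint distribution with finite second moments works; here Y is linear in X plus noise
x = rng.normal(2.0, 1.5, n)
y = 3.0 + 0.8 * x + rng.normal(0.0, 1.0, n)

# b = sigma_XY / sigma_X^2,  a = E(Y) - b * E(X), estimated from the sample
b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
a = y.mean() - b * x.mean()
y_hat = a + b * x

mse = np.mean((y - y_hat) ** 2)
# moving away from (a, b) in any direction increases the expected squared error
for da, db in [(0.1, 0.0), (0.0, 0.1), (-0.1, 0.05)]:
    assert np.mean((y - ((a + da) + (b + db) * x)) ** 2) > mse
```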

Now define $V = Y - \hat{Y}$, the deviation between outcomes of $Y$ and outcomes of the best linear prediction of $Y$.

How can I prove that $\mathbb E\big((Y-\mathbb E(Y))^2\big) = \sigma_{Y}^2\rho_{XY}^2 + \sigma_{Y}^2 (1-\rho_{XY}^2)$?

My attempt:

Since $\hat{Y}$ is the best linear prediction of $Y$, we have $E(\hat{Y}) = a + bE(X) = E(Y)$, so $E(V) = 0$. Then $E\big((Y-E(Y))^2\big) = E\big((\hat{Y}-E(\hat{Y})+V)^2\big) = \sigma_{\hat{Y}}^2 + \sigma_{V}^2 + 2\sigma_{\hat{Y}V}$.

And since $E(V) = 0$, $\sigma_{V}^2 = E\big((Y-\hat{Y})^2\big) = E\big((Y-(a+bX))^2\big) = E\Big(\big(Y - E(Y) - \tfrac{\sigma_{XY}}{\sigma_{X}^2}(X - E(X))\big)^2\Big)$.

I could not get any further.

How can I prove that $$\sigma_{\hat{Y}}^2 + \sigma_{V}^2 + 2\sigma_{\hat{Y}V} = \sigma_{Y}^2\rho_{XY}^2 + \sigma_{Y}^2 (1-\rho_{XY}^2)\,?$$
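The decomposition being asked about can at least be checked empirically before proving it. The sketch below (assumed setup: a simulated linear-plus-noise pair, chosen only for illustration) verifies on sample moments that $\sigma_{\hat{Y}}^2 = \rho_{XY}^2\sigma_{Y}^2$, that $\sigma_{V}^2 = (1-\rho_{XY}^2)\sigma_{Y}^2$, and that the cross term $\sigma_{\hat{Y}V}$ vanishes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(0.0, 2.0, n)
y = 1.0 - 0.5 * x + rng.normal(0.0, 3.0, n)

# best linear prediction from sample moments
b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
a = y.mean() - b * x.mean()
y_hat = a + b * x
v = y - y_hat  # the deviation V = Y - Y_hat

rho2 = np.corrcoef(x, y)[0, 1] ** 2
var_y = np.var(y)

# the three pieces of the claimed decomposition hold as sample identities
assert np.isclose(np.var(y_hat), rho2 * var_y)            # sigma_Yhat^2 = rho^2 * sigma_Y^2
assert np.isclose(np.var(v), (1 - rho2) * var_y)          # sigma_V^2 = (1 - rho^2) * sigma_Y^2
assert np.isclose(np.cov(y_hat, v, bias=True)[0, 1], 0.0, atol=1e-6)  # cross term is zero
```

These hold exactly (up to floating point) for the sample moments, not just in the limit, which suggests the proof should go through showing $\sigma_{\hat{Y}}^2 = \rho_{XY}^2\sigma_Y^2$ and $\sigma_{\hat{Y}V} = 0$ directly from the formulas for $a$ and $b$.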