The correlation between the Residuals and the prediction $Cov(e,\hat{Y}) =0 $


Assume a linear regression model: $y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \epsilon_i$

I'm asked to prove that:

$Cov(e,\hat{Y})$ = $0$

where: $e$ = the residuals vector

$\hat{Y}$ = the predicted vector of Y

Hint: use the fact that $X^Te$ = $0$ (I already proved this fact)
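
As an aside (this is the finite-sample counterpart of the covariance statement, not the population result the exercise asks for), the hint immediately gives exact orthogonality between the residuals and the fitted values:

$$e^T\hat{Y} = e^T X\hat{\beta} = (X^Te)^T\hat{\beta} = 0^T\hat{\beta} = 0$$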

Best answer:

You can write the fitted vector as $$\hat{Y} = X\hat{\beta}$$ Since $X$ is a constant matrix, the bilinearity of covariance lets you pull it out (on the right side, transposed): $$Cov(e, \hat{Y}) = Cov(e, X\hat{\beta}) = Cov(e, \hat{\beta})X^T$$ So it suffices to prove that $$Cov(e, \hat{\beta}) = 0$$

Since: $$Cov(e, \hat{\beta}) = Cov(Y - \hat{Y}, \hat{\beta})= Cov(Y, \hat{\beta})-Cov(\hat{Y}, \hat{\beta})$$

  1. $$Cov(Y, \hat{\beta}) = Cov(Y, (X^TX)^{-1}X^TY) = Cov(Y,Y)\times((X^TX)^{-1}X^T)^T = \sigma^2IX(X^TX)^{-1} = \sigma^2X(X^TX)^{-1}$$

  2. $$Cov(\hat{Y}, \hat{\beta}) = Cov(X\hat{\beta}, \hat{\beta}) = X[Cov(\hat{\beta},\hat{\beta})] = X[\sigma^2(X^TX)^{-1}] = \sigma^2X(X^TX)^{-1}$$

Therefore: $$Cov(Y, \hat{\beta})-Cov(\hat{Y}, \hat{\beta}) = \sigma^2X(X^TX)^{-1}-\sigma^2X(X^TX)^{-1}=0$$

Both computations rely on $Cov(\hat{\beta}) = \sigma^2(X^TX)^{-1}$ (only the covariance of $\hat{\beta}$ is needed, not its full normal distribution $\hat{\beta}\sim N\big(\beta, \sigma^2(X^TX)^{-1}\big)$) and on the fact that $\hat{Y} = X\hat{\beta} = X(X^TX)^{-1}X^TY$.
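
As a numerical sanity check (not a proof), the orthogonality relations above can be verified on simulated data: the residual vector from a least-squares fit is orthogonal to the columns of $X$ and hence to $\hat{Y}$. The data-generating choices below (sample size, coefficients, seed) are arbitrary illustrative assumptions.

```python
import numpy as np

# Simulate a small linear model y = X beta + eps (arbitrary choices).
rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # intercept column first
beta = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta + rng.normal(size=n)

# OLS fit: beta_hat = (X^T X)^{-1} X^T y, computed stably via lstsq.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat
e = y - y_hat

print(np.abs(X.T @ e).max())  # X^T e = 0 up to floating-point error
print(abs(e @ y_hat))         # e^T y_hat = 0 up to floating-point error
```

Both printed values are on the order of machine precision, matching $X^Te = 0$ and $e^T\hat{Y} = 0$.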