I wonder what $Cov(y,\hat{y})$, the covariance between the fitted value $\hat{y}$ and the observed value $y$, is in simple linear regression. A YouTube tutor said it is $0$, but I can't agree with that.
The simple linear regression model, written with the fitted coefficients, is: $$y_i = \hat{\beta_0} + \hat{\beta_1}x_i + e_i$$ $$\hat{y_i}=\hat{\beta_0} + \hat{\beta_1}x_i$$
Here is how I approached it.
For a given $i$, $$Cov(y_i,\hat{y_i})$$ $$=Cov(\hat{y_i}+e_i, \hat{y_i})$$ $$=Cov(\hat{y_i}, \hat{y_i}) + Cov(e_i, \hat{y_i})$$ $$=Var(\hat{y_i}) + 0,$$ so $Cov(y_i,\hat{y_i})$ becomes $Var(\hat{y_i})$. But with this result I can't reproduce an important derivation about the variance of $e$. I think I'm wrong somewhere, but I can't find where my derivation fails. Thank you for your help.
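As a quick sanity check on the identity $Cov(y,\hat{y}) = Var(\hat{y})$, here is a small NumPy simulation (my own sketch, not from the tutorial). It checks the *sample* covariance of $y$ and $\hat{y}$ against the sample variance of $\hat{y}$ after an OLS fit; in-sample these agree exactly because the residuals are orthogonal to the fitted values. The coefficients and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)  # arbitrary "true" model plus noise

# OLS fit by hand: slope = cov(x, y) / var(x), intercept from the means
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# Sample Cov(y, y_hat) versus sample Var(y_hat)
cov_y_yhat = np.cov(y, y_hat, ddof=1)[0, 1]
var_yhat = np.var(y_hat, ddof=1)
print(cov_y_yhat, var_yhat)  # the two agree up to floating point
```

Note this only verifies the sample-level identity; the question of $Cov(y_i,\hat{y_i})$ as random variables (where $\hat{y_i}$ inherits randomness from the estimated coefficients) is a separate, subtler matter.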
**Derivation of the variance of the prediction error**
$$e_i = y_i - \hat{y_i}$$ $$Var(y_i - \hat{y_i}) = Var(y_i) + Var(\hat{y_i}) - 2\,Cov(y_i, \hat{y_i})$$ Here we all know what $Var(y_i)$ and $Var(\hat{y_i})$ are, and my derivation of $Cov(y_i, \hat{y_i})$ is above.
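If $Cov(y_i,\hat{y_i}) = Var(\hat{y_i})$ is substituted into the variance formula, it collapses to $Var(e) = Var(y) - Var(\hat{y})$. A small self-contained check of that collapsed form with sample quantities (again my own sketch, with arbitrary coefficients and seed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)  # arbitrary linear model plus noise

# OLS fit by hand
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

var_y = np.var(y, ddof=1)
var_yhat = np.var(y_hat, ddof=1)
var_resid = np.var(y - y_hat, ddof=1)
# With Cov(y, y_hat) = Var(y_hat), the formula
# Var(y - y_hat) = Var(y) + Var(y_hat) - 2 Cov(y, y_hat)
# reduces to Var(e) = Var(y) - Var(y_hat):
print(var_resid, var_y - var_yhat)  # these agree up to floating point
```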