In a linear model, if we assume the observations are i.i.d., will the residuals be i.i.d. under OLS?


In the linear model, we usually assume random sampling. I know this assumption is important, since its violation can bias the estimator. But if this assumption holds, can we prove that the residuals are still i.i.d. under OLS? Or is there an example where this assumption fails to guarantee i.i.d. residuals?


Best answer:

If $y_i = x_i^\top \beta + \epsilon_i$ where the $\epsilon_i$ are i.i.d. with zero mean and variance $\sigma^2$, then the fitted values under OLS are $\hat{y} = Hy$, where $H = X(X^\top X)^{-1} X^\top$ is the hat matrix, and thus the residuals are $y - \hat{y} = (I-H)y$.
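A quick numerical check of this identity, using a hypothetical random design matrix `X` and coefficient vector `beta`: the residuals computed as $(I-H)y$ coincide with those from an ordinary least-squares fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.normal(size=(n, p))            # hypothetical design matrix
beta = np.array([1.0, -2.0, 0.5])      # hypothetical true coefficients
y = X @ beta + rng.normal(size=n)      # i.i.d. N(0, 1) errors

# Hat matrix H = X (X^T X)^{-1} X^T and residuals via (I - H) y
H = X @ np.linalg.inv(X.T @ X) @ X.T
resid_via_H = (np.eye(n) - H) @ y

# Same residuals via the usual OLS fit y - X beta_hat
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_fit = y - X @ beta_hat

print(np.allclose(resid_via_H, resid_fit))  # the two agree
```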

The covariance matrix of the residuals is $$\text{Cov}((I-H)y) = (I-H)\text{Cov}(y)(I-H)^\top = (I-H)(\sigma^2 I)(I-H)^\top = \sigma^2(I-H),$$ where the last step uses the fact that $I-H$ is symmetric and idempotent, so $(I-H)(I-H)^\top = I-H$. This matrix is in general not diagonal, so the residuals are correlated; moreover, its diagonal entries $\sigma^2(1-h_{ii})$ depend on the leverages $h_{ii}$ and thus generally differ. So the residuals are, in general, neither independent nor identically distributed.