Correlation of errors in linear regression

How do correlated errors result in underestimating the true standard errors in linear regression?

Assuming the data generating process is $Y = X \beta + \varepsilon$ with $\mathrm{E}[\varepsilon] = 0$,

$$ \begin{array}{rl}\operatorname{Var}(\hat\beta) &= \operatorname{Var}\big((X'X)^{-1}X'Y\big) = \operatorname{Var}\big((X'X)^{-1}X'[X\beta + \varepsilon]\big) \\ &= \operatorname{Var}\big(\beta + (X'X)^{-1}X'\varepsilon\big) = \operatorname{Var}\big((X'X)^{-1}X'\varepsilon\big) \\ &= (X'X)^{-1}X'\,\mathrm{E}[\varepsilon\varepsilon']\,X(X'X)^{-1} = (X'X)^{-1}X'\Sigma X(X'X)^{-1} \\ &\neq (X'X)^{-1}\sigma^2 \quad \text{unless } \Sigma = \sigma^2 I \end{array}$$

(the second line uses the fact that $\beta$ is a fixed constant, so it contributes no variance).
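As a numerical sketch (the AR(1) error covariance and random-walk regressor below are illustrative assumptions, not part of the question), the sandwich variance $(X'X)^{-1}X'\Sigma X(X'X)^{-1}$ can be compared directly with the classical expression $\sigma_\varepsilon^2 (X'X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 50, 0.9  # illustrative sample size and AR(1) correlation

# Intercept plus a strongly autocorrelated (random-walk) regressor
x = np.cumsum(rng.normal(size=n))
X = np.column_stack([np.ones(n), x])

# AR(1) error covariance: Sigma_ts = rho^|t-s| / (1 - rho^2)
idx = np.arange(n)
Sigma = rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho**2)
sigma2_eps = 1.0 / (1 - rho**2)  # matching marginal error variance

XtX_inv = np.linalg.inv(X.T @ X)
sandwich = XtX_inv @ X.T @ Sigma @ X @ XtX_inv  # true Var(beta_hat)
classical = sigma2_eps * XtX_inv                # what the OLS formula assumes

print("true slope variance:     ", sandwich[1, 1])
print("classical slope variance:", classical[1, 1])
```

With positive autocorrelation in both the regressor and the errors, the true slope variance comes out larger than the classical one, which is exactly the underestimation the question asks about.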

If the errors are correlated, $\Sigma = \mathrm{E}[\varepsilon \varepsilon'] \neq \sigma^2 I$, and the true variance of $\hat\beta$ is the sandwich form $(X'X)^{-1}X'\Sigma X(X'X)^{-1}$ rather than $(X'X)^{-1}\sigma^2$. With positively correlated errors and an autocorrelated regressor (the typical time-series case), the sandwich variance exceeds the classical one, so the classical standard errors are too small. Since the $t$ and $F$ statistics are built from the classical OLS variance $(X'X)^{-1}\sigma^2$, they no longer have their desired distributions under the null hypothesis in this case.
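A small Monte Carlo sketch makes the underestimation visible (all parameter choices here, including the AR(1) coefficient and true coefficients, are illustrative assumptions): when both the regressor and the errors follow a positively autocorrelated AR(1) process, the empirical spread of the slope estimates exceeds the standard error the classical formula reports.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, rho = 200, 2000, 0.8  # illustrative sample size, replications, AR(1) coefficient

# One fixed autocorrelated regressor (AR(1)), plus an intercept
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

slopes, classical_se = [], []
for _ in range(reps):
    # AR(1) errors: eps_t = rho * eps_{t-1} + u_t, so E[eps eps'] != sigma^2 I
    u = rng.normal(size=n)
    eps = np.empty(n)
    eps[0] = u[0]
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + u[t]
    y = X @ np.array([1.0, 2.0]) + eps  # assumed true coefficients
    bhat = XtX_inv @ X.T @ y
    resid = y - X @ bhat
    s2 = resid @ resid / (n - 2)  # classical sigma^2 estimate
    slopes.append(bhat[1])
    classical_se.append(np.sqrt(s2 * XtX_inv[1, 1]))

print("empirical SD of slope estimates: ", np.std(slopes))
print("average classical standard error:", np.mean(classical_se))
```

The first number (the actual sampling variability of $\hat\beta_1$) comes out noticeably larger than the second (what the classical formula claims), so nominal $t$ statistics are too large and confidence intervals too narrow.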