Is there a least squares estimator for correlated, non-constant variance errors?


For OLS we have $\hat{\beta} = (X^TX)^{-1}X^Ty$,

For non-constant variance we have $\hat{\beta} = (X^TWX)^{-1}X^TWy$,

but what if we have, for example, $Y = X\beta + \epsilon$, where $\epsilon \sim N(0, aK + bI_{n})$, $K$ is a covariance matrix, and $a,b$ are positive constants?

I tried a spectral decomposition $K = U\Lambda U^{T}$ to simplify a bit,

$aK + bI_{n} = U(a\Lambda + bI_{n})U^{T},$

but now I'm unsure how to proceed.
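As a quick numerical sanity check of that identity, here is a minimal sketch with made-up values ($K$, $a$, $b$, and the dimension are illustrative, not from the post): shifting $K$ by $bI$ simply shifts its eigenvalues, while the eigenvectors are unchanged.

```python
import numpy as np

# Check numerically that aK + bI = U (a*Lambda + b*I) U^T,
# where K = U Lambda U^T is the spectral decomposition of K.
# K, a, b below are arbitrary example values.
rng = np.random.default_rng(42)
n = 4
A = rng.standard_normal((n, n))
K = A @ A.T                               # symmetric positive semi-definite K
a, b = 3.0, 1.5

lam, U = np.linalg.eigh(K)                # K = U diag(lam) U^T
lhs = a * K + b * np.eye(n)
rhs = U @ (a * np.diag(lam) + b * np.eye(n)) @ U.T
print(np.allclose(lhs, rhs))  # True
```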


There are 2 answers below.

BEST ANSWER

Write the spectral decomposition $K = P\Lambda P^{T}$ with $P$ orthogonal, and denote $W = aK + bI$. Then $$ W^{-1} = (aK + bI)^{-1} = (aP\Lambda P^{T} + bPP^{T})^{-1} = (P(a\Lambda + bI)P^{T})^{-1} = P(a\Lambda + bI)^{-1}P^{T}, $$ hence $$ W^{-1/2} = P(a\Lambda + bI)^{-1/2}P^{T}. $$ Namely, the whitening transformation is $$ W^{-1/2}Y = P(a\Lambda + bI)^{-1/2}P^{T}Y. $$ Verification: \begin{align} \operatorname{var}(W^{-1/2}Y) &= W^{-1/2}(aK + bI)W^{-1/2}\\ &= P(a\Lambda + bI)^{-1/2}P^{T}\,P(a\Lambda + bI)P^{T}\,P(a\Lambda + bI)^{-1/2}P^{T} \\ &= PP^{T}\\ &= I. \end{align}
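The construction of $W^{-1/2}$ and the variance check above can be sketched numerically. This is a minimal example with made-up $K$, $a$, $b$:

```python
import numpy as np

# Build W = a*K + b*I, compute W^{-1/2} via the spectral decomposition
# of K, and verify that W^{-1/2} W W^{-1/2} = I (the whitened variance).
# K, a, b are illustrative values, not from the original post.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
K = A @ A.T                      # symmetric positive semi-definite K
a, b = 2.0, 0.5

lam, P = np.linalg.eigh(K)       # K = P diag(lam) P^T
W = a * K + b * np.eye(n)
W_inv_sqrt = P @ np.diag(1.0 / np.sqrt(a * lam + b)) @ P.T

# var(W^{-1/2} Y) = W^{-1/2} W W^{-1/2}, which should be the identity
cov_whitened = W_inv_sqrt @ W @ W_inv_sqrt
print(np.allclose(cov_whitened, np.eye(n)))  # True
```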

So, $$ \hat{\beta} = (X'W^{-1}X)^{-1}X'W^{-1}y $$ is the generalized least squares (GLS) estimator.
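Equivalently, this estimator is ordinary least squares applied to the whitened data $W^{-1/2}X$, $W^{-1/2}y$. A small sketch on simulated data (all values hypothetical) confirms the two computations agree:

```python
import numpy as np

# GLS estimator (X' W^{-1} X)^{-1} X' W^{-1} y versus OLS on the
# whitened data W^{-1/2} X, W^{-1/2} y. All data here is simulated.
rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, n))
K = A @ A.T
a, b = 1.0, 0.3
W = a * K + b * np.eye(n)                 # error covariance aK + bI

beta_true = np.array([1.0, -2.0, 0.5])
# Draw y with errors of covariance W via a Cholesky factor of W
y = X @ beta_true + np.linalg.cholesky(W) @ rng.standard_normal(n)

# Direct GLS estimate
W_inv = np.linalg.inv(W)
beta_gls = np.linalg.solve(X.T @ W_inv @ X, X.T @ W_inv @ y)

# OLS on whitened data, using W^{-1/2} from the spectral decomposition of K
lam, P = np.linalg.eigh(K)
W_inv_sqrt = P @ np.diag(1.0 / np.sqrt(a * lam + b)) @ P.T
beta_white, *_ = np.linalg.lstsq(W_inv_sqrt @ X, W_inv_sqrt @ y, rcond=None)

print(np.allclose(beta_gls, beta_white))  # True
```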


It should be identical to your second case (edit: the estimator as you have written it is incorrect; see the other answer for the correct $\hat \beta$). Let $W = aK+bI$.

This is generalized least squares (https://en.wikipedia.org/wiki/Generalized_least_squares).