Question regarding linear regression weighting matrix


Consider the linear regression model

$$b = Xy + e, \quad E[e] = 0, \quad E[ee'] = V$$

Assume that the matrix $X$ has linearly independent columns. It is well known that the minimum variance affine unbiased estimator of $y$ is

$$\hat{y} = (X'WX)^+ X'Wb $$

where the superscript $+$ denotes the Moore–Penrose inverse and the optimal weighting matrix $W$ is

$$W = (V + XTX')^+$$

where $T$ may be any positive semidefinite matrix such that

$$\text{col}\, X \subseteq \text{col}\, (V + XTX')$$

I am trying to verify directly that the quantity $(X'WX)^+X'W$ does not depend on the choice of $T$, so long as $T$ satisfies the conditions above. Note that $T = I$ is always admissible, since $\text{col}\,(V + XX') = \text{col}\, V + \text{col}\, X \supseteq \text{col}\, X$. So it suffices to show that, for any admissible $T$,

$$[X'(V + XX')^+X]^+X'(V + XX')^+ = [X'(V + XTX')^+X]^+X'(V + XTX')^+$$

I have been unable to arrive at a proof for this. Can anyone help prove the result, or provide a counterexample?
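For what it's worth, the identity does hold in the numerical cases I have tried. The sketch below (plain NumPy; the setup and variable names are mine) compares the estimator matrix $(X'WX)^+X'W$ for several choices of $T$, taking $V$ positive definite so that $T = 0$ also satisfies the column condition:

```python
import numpy as np

# Numerical spot-check of the claimed invariance (a sketch, not a proof).
rng = np.random.default_rng(0)
n, k = 6, 3
X = rng.standard_normal((n, k))     # full column rank almost surely
A = rng.standard_normal((n, n))
V = A @ A.T + np.eye(n)             # positive definite, so T = 0 is admissible

def estimator_matrix(T):
    """Return (X'WX)^+ X'W with W = (V + X T X')^+."""
    W = np.linalg.pinv(V + X @ T @ X.T)
    return np.linalg.pinv(X.T @ W @ X) @ X.T @ W

M0 = estimator_matrix(np.zeros((k, k)))          # T = 0
M1 = estimator_matrix(np.eye(k))                 # T = I (always admissible)
M2 = estimator_matrix(np.diag([1.0, 2.0, 3.0]))  # another PSD choice

print(np.allclose(M0, M1), np.allclose(M0, M2))
```

Of course this only probes the case of nonsingular $V$; the interesting part of the question is the general positive semidefinite case.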