Linear regression: $b_1$ has the minimum variance among all unbiased linear estimators of $\beta_1$


There is a proof provided in Applied Linear Regression Models (1983) by Kutner et al. (page 64), which is quite clear and easy to understand, except for one point: it asserts that $\sum k_i d_i = 0$ follows from the "restrictions on $k_i$ and $c_i$", without expounding on what those restrictions are. Could anyone please explain? Thanks!



I suspect that by "restrictions on $k_i$ and $c_i$" what is meant is that the estimator must be unbiased (that's one restriction) and that the $k_i$ are the least-squares weights (that's the other). The first restriction was stated after the words "Since $\hat\beta_1$ must be unbiased": unbiasedness of $\sum_i c_i Y_i$ for every $\beta_0, \beta_1$ forces $\sum_i c_i = 0$ and $\sum_i c_i X_i = 1$. The least-squares weights $k_i$ satisfy the same two conditions, so the differences $d_i = c_i - k_i$ satisfy $\sum_i d_i = 0$ and $\sum_i d_i X_i = 0$; in effect, $\sum_i d_i Y_i$ is an unbiased estimator of $0$.
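Concretely, writing the least-squares weights as $k_i = (X_i - \bar X)/\sum_j (X_j - \bar X)^2$ (the standard form in Kutner et al.), the two restrictions above force the cross term to vanish:

```latex
\begin{align*}
\sum_i k_i d_i
  &= \sum_i \frac{(X_i - \bar X)\, d_i}{\sum_j (X_j - \bar X)^2} \\
  &= \frac{\sum_i d_i X_i - \bar X \sum_i d_i}{\sum_j (X_j - \bar X)^2} \\
  &= \frac{0 - \bar X \cdot 0}{\sum_j (X_j - \bar X)^2} = 0,
\end{align*}
```

using $\sum_i d_i X_i = 0$ and $\sum_i d_i = 0$ in the last step. This is exactly the step the textbook glosses over, and it is why $\sigma^2\{\hat\beta_1\} = \sigma^2 \sum_i c_i^2 = \sigma^2\left(\sum_i k_i^2 + \sum_i d_i^2\right) \ge \sigma^2 \sum_i k_i^2 = \sigma^2\{b_1\}$.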

The way I've seen this result proved before relies on a lemma that shows that the least-squares estimators are uncorrelated with every linear unbiased estimator of $0$.
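As a quick numerical illustration of that lemma (a sketch, not from the textbook; the variable names and the use of a least-squares projection to build $d$ are my own choices): any $d$ with $\sum_i d_i = 0$ and $\sum_i d_i X_i = 0$ lies in the orthogonal complement of $\mathrm{span}\{\mathbf{1}, X\}$, while $k$ lies inside that span, so $\sum_i k_i d_i = 0$ and the competing estimator's variance can only be larger.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
X = rng.normal(size=n)

# Least-squares weights: b1 = sum(k_i * Y_i)
k = (X - X.mean()) / ((X - X.mean()) ** 2).sum()

# Build an arbitrary d with sum(d) = 0 and sum(d * X) = 0 by taking the
# residual of a random vector regressed on the columns {1, X}.
Z = np.column_stack([np.ones(n), X])
v = rng.normal(size=n)
d = v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]

# c = k + d gives the weights of another linear unbiased estimator of beta1:
c = k + d
assert np.isclose(c.sum(), 0.0)          # sum c_i = 0
assert np.isclose((c * X).sum(), 1.0)    # sum c_i X_i = 1

# The cross term vanishes, so Var(sum c_i Y_i) = sigma^2 * sum(c_i^2)
# = sigma^2 * (sum(k_i^2) + sum(d_i^2)) >= sigma^2 * sum(k_i^2) = Var(b1).
assert np.isclose((k * d).sum(), 0.0)
assert (c ** 2).sum() >= (k ** 2).sum()
print("sum k_i d_i =", (k * d).sum())
```

The same orthogonality argument works for any choice of $v$: whatever linear unbiased competitor you construct, its weight vector decomposes as $k$ plus a component orthogonal to $k$, which can only add variance.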