Compare variances of two estimators of a linear regression


I am given the model $Y_k = a + bX_k + \epsilon_k$, $k = 1,\dots, n$, where the error terms have variance $\sigma^2$. I need to compare the variance of the estimator $\hat{b} = \frac{1}{n}\sum_{k=1}^n \frac{Y_k - \bar{Y}}{X_k -\bar{X}}$ with the variance of the OLS estimator of the slope. When I calculate the variance of $\hat{b}$, I get zero, but intuitively it cannot be zero.
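For intuition, a quick Monte Carlo makes it easy to see that the variance of $\hat{b}$ is not zero, and to compare it with OLS. (This is only a sketch; the values of $a$, $b$, $\sigma$, and the fixed design points $X$ below are arbitrary illustrative choices, not from the problem statement.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative choices (not from the problem statement)
n, a, b, sigma = 20, 1.0, 2.0, 1.0
X = np.linspace(0.0, 5.0, n)          # fixed design points; none equals X.mean()
Xbar = X.mean()

b_hat, b_ols = [], []
for _ in range(20000):
    eps = rng.normal(0.0, sigma, n)
    Y = a + b * X + eps
    Ybar = Y.mean()
    # Averaged-slopes estimator from the question
    b_hat.append(np.mean((Y - Ybar) / (X - Xbar)))
    # OLS slope estimator
    b_ols.append(np.sum((X - Xbar) * (Y - Ybar)) / np.sum((X - Xbar) ** 2))

print(np.var(b_hat), np.var(b_ols))
```

In such experiments the empirical variance of $\hat{b}$ is strictly positive and noticeably larger than that of the OLS slope.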

Best Answer

Let's calculate the variance more carefully. Substituting $Y_k = a + bX_k + \epsilon_k$ and $\bar{Y} = a + b\bar{X} + \bar{\epsilon}$ gives $\frac{Y_k - \bar{Y}}{X_k - \bar{X}} = b + \frac{\epsilon_k - \bar{\epsilon}}{X_k - \bar{X}}$, so
$$ \operatorname{Var}(\hat{b}) = \frac{1}{n^2} \operatorname{Var}\left( \sum_{k=1}^n \frac{\epsilon_k - \bar{\epsilon}}{X_k - \bar{X}} \right) = \frac{\sigma^2}{n^2} \left( \sum_{k=1}^n \frac{1}{(X_k - \bar{X})^2} - \frac{1}{n} \left( \sum_{k=1}^n \frac{1}{X_k - \bar{X}} \right)^2 \right), $$
where the second equality uses $\operatorname{Var}(\epsilon_k - \bar{\epsilon}) = \sigma^2\left(1 - \frac{1}{n}\right)$ and $\operatorname{Cov}(\epsilon_j - \bar{\epsilon}, \epsilon_k - \bar{\epsilon}) = -\frac{\sigma^2}{n}$ for $j \ne k$; note that the terms are correlated through $\bar{\epsilon}$, so the variance of the sum is not simply the sum of the variances.

The last expression is not zero.
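As a sanity check: writing $c_k = \frac{1}{X_k - \bar{X}}$ and accounting for $\operatorname{Cov}(\epsilon_j - \bar{\epsilon}, \epsilon_k - \bar{\epsilon}) = -\frac{\sigma^2}{n}$, one gets the closed form $\operatorname{Var}(\hat{b}) = \frac{\sigma^2}{n^2}\left(\sum_k c_k^2 - \frac{1}{n}\left(\sum_k c_k\right)^2\right)$, which a small simulation confirms (the design points below are arbitrary, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(1)

n, sigma = 15, 1.0
X = rng.uniform(0.0, 10.0, n)         # arbitrary fixed design points
c = 1.0 / (X - X.mean())

# Closed-form variance, accounting for the -sigma^2/n covariances
var_exact = sigma**2 / n**2 * (np.sum(c**2) - np.sum(c)**2 / n)

# Monte Carlo estimate: b_hat - b = mean of c_k * (eps_k - eps_bar),
# so the intercept and slope drop out of the variance
draws = []
for _ in range(40000):
    eps = rng.normal(0.0, sigma, n)
    draws.append(np.mean((eps - eps.mean()) * c))
var_mc = np.var(draws)

print(var_exact, var_mc)              # the two values should be close
```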

Hint
To compare it with the variance of the OLS estimator, use the Cauchy–Schwarz inequality.
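One way to carry out the hint (a sketch): let $c_k = \frac{1}{X_k - \bar{X}}$ and $\bar{c} = \frac{1}{n}\sum_k c_k$; accounting for the covariances $\operatorname{Cov}(\epsilon_j - \bar{\epsilon}, \epsilon_k - \bar{\epsilon}) = -\frac{\sigma^2}{n}$, the exact variance is $\operatorname{Var}(\hat{b}) = \frac{\sigma^2}{n^2}\sum_{k=1}^n (c_k - \bar{c})^2$. Since $\sum_{k=1}^n (X_k - \bar{X})(c_k - \bar{c}) = \sum_{k=1}^n 1 - \bar{c}\sum_{k=1}^n (X_k - \bar{X}) = n$, Cauchy–Schwarz gives
$$ n^2 = \left( \sum_{k=1}^n (X_k - \bar{X})(c_k - \bar{c}) \right)^2 \le \sum_{k=1}^n (X_k - \bar{X})^2 \sum_{k=1}^n (c_k - \bar{c})^2, $$
and therefore
$$ \operatorname{Var}(\hat{b}) = \frac{\sigma^2}{n^2} \sum_{k=1}^n (c_k - \bar{c})^2 \ge \frac{\sigma^2}{\sum_{k=1}^n (X_k - \bar{X})^2} = \operatorname{Var}(\hat{b}_{\mathrm{OLS}}), $$
consistent with the Gauss–Markov theorem.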