Variance of a slope estimator for a linear model


I am self-studying a textbook, and attempting an exercise in which the standard assumptions for a linear model apply, but not necessarily normality of the error term: $\mathbb{E}[\epsilon_i] = 0$, $\mathbb{E}[\epsilon_i^2] = \sigma^2$, $\mathbb{E}[\epsilon_i \epsilon_j] = 0$ for $i \neq j$, and $y_i = \alpha + \beta x_i + \epsilon_i$. From this it follows that $var(y_i) = \sigma^2$.

In such a setting, we know the OLS estimator for the slope $\beta$ is:

$$ b = \frac{\sum(x_i - \bar{x})(y_i - \bar{y})}{\sum(x_i - \bar{x})^2} $$

and its variance is equal to:

$$ var(b) = \frac{\sigma^2}{\sum(x_i - \bar{x})^2} $$
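As a quick sanity check on this formula (not part of the exercise itself), here is a small Monte Carlo sketch in Python. The design $x = 1, \dots, n$, the parameter values, and the centered-uniform error distribution are all my own assumptions, chosen only to illustrate that the formula holds without normality:

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta, sigma = 10, 1.0, 2.0, 0.5   # assumed values for illustration
x = np.arange(1.0, n + 1)                   # assumed fixed design
Sxx = np.sum((x - x.mean()) ** 2)

reps = 200_000
# Non-normal errors: centered uniform, scaled so var(eps) = sigma^2
eps = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(reps, n)) * sigma
y = alpha + beta * x + eps

# OLS slope for each replication
b = ((x - x.mean()) * (y - y.mean(axis=1, keepdims=True))).sum(axis=1) / Sxx

print(b.var(), sigma**2 / Sxx)  # empirical vs. theoretical variance
```

The empirical variance of the simulated slopes should closely match $\sigma^2 / \sum(x_i - \bar{x})^2$.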

In the exercise, I am asked to calculate the variance of another (unbiased) estimator, $b_1 = \frac{y_n - y_1}{x_n - x_1}$, and to show that $var(b_1) \geq var(b)$. The variance I calculated is:

$$ var(b_1) = var\left(\frac{y_n - y_1}{x_n - x_1}\right) = \frac{1}{(x_n - x_1)^2} var(y_n - y_1) = \frac{1}{(x_n - x_1)^2} \left[var(y_n) + var(y_1)\right] = \frac{2\sigma^2}{(x_n - x_1)^2} $$
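To check my calculation numerically, the same kind of simulation can be run for $b_1$ (again, the design $x = 1, \dots, n$, parameter values, and uniform errors are assumed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha, beta, sigma = 10, 1.0, 2.0, 0.5   # assumed values for illustration
x = np.arange(1.0, n + 1)                   # assumed fixed design

reps = 200_000
# Non-normal errors: centered uniform, scaled so var(eps) = sigma^2
eps = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(reps, n)) * sigma
y = alpha + beta * x + eps

# Two-point estimator b1 = (y_n - y_1) / (x_n - x_1) for each replication
b1 = (y[:, -1] - y[:, 0]) / (x[-1] - x[0])

theory = 2 * sigma**2 / (x[-1] - x[0]) ** 2
Sxx = np.sum((x - x.mean()) ** 2)
print(b1.var(), theory, sigma**2 / Sxx)  # empirical, 2*sigma^2/(x_n-x_1)^2, var(b)
```

In this particular setup the empirical variance matches $\frac{2\sigma^2}{(x_n - x_1)^2}$ and exceeds $\frac{\sigma^2}{\sum(x_i - \bar{x})^2}$, though of course a simulation is only a consistency check, not the general proof the exercise asks for.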

Is the variance I calculated correct, and how do I show that $var(b_1) \geq var(b)$ in general?

Many thanks for your help!