Compare the variances of restricted and unrestricted estimators?


Problem
Given a linear model $y_i = \beta_1 + \beta_2 x_i +\epsilon_i, \quad i = 1, \dots, n$
I need to compare the variance of the ordinary least squares (OLS) estimator of $\beta_2$ without restrictions and the variance of the OLS estimator of $\beta_2$ under the linear restriction $\beta_1 = 0$ (i.e. compare $\Bbb Var(\beta_2^R)$ and $\Bbb Var(\beta_2^{UR})$).
Is $\beta_2^R$ unbiased, and are any of the Gauss–Markov assumptions violated?

My ideas
Intuitively, the restricted variance should be smaller than the unrestricted variance, because imposing a restriction reduces the flexibility of the model and hence the variance of the estimator.
However, I need a mathematical proof of this.

With some help I have obtained the estimated variances $$ \widehat{\Bbb Var}(\beta_2^{UR}) = \frac{\sum(y_i - \hat\beta_1- \hat\beta_2 x_i)^2/(n-2)}{\sum(x_i - \bar{x})^2}$$ and $$ \widehat{\Bbb Var}(\beta_2^{R}) = \frac{\sum(y_i - \hat\beta_2 x_i)^2/(n-1)}{\sum x_i^2 },$$ where the residual degrees of freedom are $n-2$ in the unrestricted model (two estimated parameters) and $n-1$ in the restricted model (one estimated parameter).
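As a quick numerical check, here is a minimal sketch with numpy on hypothetical data (the $x_i$, $y_i$ values below are illustrative, not part of the original problem); it computes both slope estimates and plugs them into the variance formulas, using $n-1$ residual degrees of freedom for the restricted model since it estimates a single parameter:

```python
import numpy as np

# Hypothetical example data (not from the original question).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
n = len(x)

# Unrestricted OLS: y_i = beta1 + beta2 * x_i + eps_i
X_u = np.column_stack([np.ones(n), x])
b1_u, b2_u = np.linalg.lstsq(X_u, y, rcond=None)[0]
resid_u = y - b1_u - b2_u * x
var_hat_u = (resid_u @ resid_u / (n - 2)) / np.sum((x - x.mean()) ** 2)

# Restricted OLS (beta1 = 0): y_i = beta2 * x_i + eps_i
b2_r = (x @ y) / (x @ x)
resid_r = y - b2_r * x
var_hat_r = (resid_r @ resid_r / (n - 1)) / np.sum(x ** 2)  # one parameter -> n-1 df
```

Note that the denominators differ: $\sum (x_i - \bar{x})^2 = 10$ for the unrestricted estimator versus $\sum x_i^2 = 55$ for the restricted one on this data.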

Best answer

You can use the formula $\mathbb{V}(\boldsymbol{\hat{\beta}}) = \sigma^2 (X^\text{T} X)^{-1}$, where $\sigma^2$ is the variance of the disturbance term, i.e. $\Bbb Var(\epsilon_i) = \sigma^2$.

If the regression has an intercept, then $X$ has the form $$\begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1& x_n \end{bmatrix}.$$ If the regression has no intercept, then $X$ is a single column:

$$\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}.$$
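To spell out the intermediate step: with an intercept,

$$X^\text{T} X = \begin{bmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{bmatrix},$$

and the $(2,2)$ entry of its inverse is $$\frac{n}{n\sum x_i^2 - \left(\sum x_i\right)^2} = \frac{1}{\sum x_i^2 - n\bar{x}^2},$$ using $\sum x_i = n\bar{x}$. Without an intercept, $X^\text{T}X = \sum x_i^2$ is a scalar, so its inverse is simply $1/\sum x_i^2$.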

Therefore, using the formula above we obtain that

$$\Bbb {V}(\hat{\beta}_2^{UR}) = \frac{\sigma^2}{\sum x_i^2 - n \bar{x}^2}.$$

$$\Bbb {V}(\hat{\beta}_2^R) = \frac{\sigma^2}{\sum x_i^2}.$$

From these formulas the comparison follows directly: since $\sum x_i^2 - n\bar{x}^2 = \sum (x_i - \bar{x})^2 \le \sum x_i^2$, we get $\Bbb{V}(\hat{\beta}_2^R) \le \Bbb{V}(\hat{\beta}_2^{UR})$, with equality if and only if $\bar{x} = 0$. As for unbiasedness: $\hat{\beta}_2^R$ is unbiased only when the restriction $\beta_1 = 0$ actually holds; otherwise $\Bbb{E}[\hat{\beta}_2^R] = \beta_2 + \beta_1 \sum x_i / \sum x_i^2$.
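To see the variance comparison (and the bias of the restricted estimator) concretely, here is a minimal Monte Carlo sketch; the true values `beta1`, `beta2`, `sigma` and the design points `x` are assumptions chosen for illustration, with $\beta_1 \neq 0$ so that the restriction is false:

```python
import numpy as np

rng = np.random.default_rng(0)
beta1, beta2, sigma = 1.0, 2.0, 1.0      # assumed true values; beta1 != 0
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # fixed design points (assumed)
n, reps = len(x), 20_000

sxx = np.sum((x - x.mean()) ** 2)        # sum (x_i - xbar)^2 = 10
sx2 = np.sum(x ** 2)                     # sum x_i^2 = 55

b2_ur = np.empty(reps)
b2_r = np.empty(reps)
for k in range(reps):
    y = beta1 + beta2 * x + rng.normal(0.0, sigma, n)
    b2_ur[k] = ((x - x.mean()) @ (y - y.mean())) / sxx  # unrestricted slope
    b2_r[k] = (x @ y) / sx2                             # restricted slope

# Theoretical variances from the formulas above.
var_ur_theory = sigma**2 / sxx  # = 0.1
var_r_theory = sigma**2 / sx2   # = 1/55, smaller

# The restricted estimator has smaller variance but, because beta1 != 0 here,
# it is biased: E[b2_r] = beta2 + beta1 * sum(x_i) / sum(x_i^2).
```

The empirical variances match the closed-form expressions, and the restricted estimator's smaller variance comes at the price of a bias whenever the restriction is false.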