Unbiasedness of estimator in Linear Regression.


Suppose you have a standard linear regression model

$$y_i = \beta_0 + x_i\beta_1 + \epsilon_i$$

with an estimator $\hat{\beta}_1$ of $\beta_1$ (which I believe is OLS):

$$\hat{\beta}_{1} = \frac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2}$$

How do I show that $\hat{\beta}_1$ is unbiased? I assume this means showing $E[\hat{\beta}_1] = \beta_1$.

My try:

First rewrite $\hat{\beta}_1$:

$\hat{\beta}_1 = \frac{\sum x_iy_i}{\sum x_i^2} = \frac{\sum x_i(\beta_0+x_i\beta_1+\epsilon_i)}{\sum x_i^2} = \frac{\beta_0\sum x_i + \beta_1\sum x_i^2 + \sum x_i \epsilon_i}{\sum x_i^2} = \beta_1 + \frac{\beta_0\sum x_i + \sum x_i \epsilon_i}{\sum x_i^2}$

But now I do not know how to continue. Taking the expectation of the last expression we of course get $\beta_1 + \text{something}$. But since this is OLS (hence unbiased, I believe, though I am not sure), the $something$ should equal $0$. How is this possible?
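Taking expectations term by term (treating the $x_i$ as fixed and assuming $E[\epsilon_i]=0$, as in the standard model) makes the "something" explicit:

$$E[\hat{\beta}_1] = \beta_1 + \frac{\beta_0\sum x_i + \sum x_i E[\epsilon_i]}{\sum x_i^2} = \beta_1 + \frac{\beta_0\sum x_i}{\sum x_i^2},$$

so the bias term vanishes exactly when $\beta_0 = 0$ (no intercept) or $\sum x_i = 0$ (centered regressors).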

1 Answer

The OLS estimate of $\beta_1$ is $\frac{\sum_{i=1}^n(x_i-\overline{x})(y_i-\overline{y})}{\sum_{i=1}^n(x_i-\overline{x})^2}$, which is unbiased (this is the standard result).

  • When $\overline{x}=\overline{y}=0$, your $\hat{\beta}_1$ equals this OLS estimate and is therefore unbiased.
  • Otherwise, $\frac{\sum_{i=1}^n(x_i-\overline{x})(y_i-\overline{y})}{\sum_{i=1}^n(x_i-\overline{x})^2}\ne\frac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2}\left(=\hat{\beta}_1\right)$ in general, because of the constant ($\beta_0$) term in the decomposition (as you showed).

The conclusion: your calculation is correct, and $\hat{\beta}_1$ is an unbiased estimate only when it coincides with the OLS estimate, which happens in the zero-intercept case $\beta_0 = 0$ (or when the $x_i$ sum to zero; see the derivation in the question).
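A quick Monte Carlo check of this conclusion (a sketch; the design points, true coefficients, noise level, and replication count below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 2.0, 3.0                   # true intercept and slope (illustrative)
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])  # fixed design with nonzero mean

ratio_est, ols_est = [], []
for _ in range(20000):
    eps = rng.normal(0.0, 1.0, size=x.size)
    y = beta0 + beta1 * x + eps
    # the question's estimator: sum(x*y) / sum(x^2)
    ratio_est.append(np.sum(x * y) / np.sum(x ** 2))
    # the centered OLS slope estimator
    ols_est.append(np.sum((x - x.mean()) * (y - y.mean()))
                   / np.sum((x - x.mean()) ** 2))

# the "something" from the question's decomposition, after taking expectations
bias_theory = beta0 * x.sum() / np.sum(x ** 2)

print(np.mean(ratio_est) - beta1)  # close to bias_theory, not 0
print(np.mean(ols_est) - beta1)    # close to 0
print(bias_theory)
```

With $\beta_0 = 0$ the simulated bias of the ratio estimator drops to zero as well, matching the answer's zero-intercept condition.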