Proof Verification: $\tilde{\beta_1}$ is an unbiased estimator of $\beta_1$ obtained by assuming intercept is zero


Consider the standard simple regression model $y= \beta_0 + \beta_1 x +u$ under the Gauss-Markov Assumptions SLR.1 through SLR.5.
Let $\tilde{\beta_1}$ be the estimator for $\beta_1$ obtained by assuming the intercept is 0. Find $E[\tilde{\beta_1}]$ in terms of the $x_i$, $\beta_0$, and $\beta_1$. Verify that $\tilde{\beta_1}$ is an unbiased estimator of $\beta_1$ when $\beta_0 = 0$. Are there any other cases when $\tilde{\beta_1}$ is unbiased?

Proof:

We need to prove that $E[\tilde{\beta_1}] = \beta_1$.

Using least squares without an intercept, i.e. minimizing $\sum{(y_i - b x_i)^2}$ over $b$, we find that $\tilde{\beta_1} = \dfrac{\sum{x_iy_i}}{\sum{(x_i)^2}}$
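As a quick numerical sanity check (my own sketch, not part of the original post), the closed-form expression agrees with what a least-squares solver returns when the design matrix contains only $x$ and no intercept column:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)
y = 1.5 + 0.8 * x + rng.normal(size=50)  # arbitrary illustrative parameters

# Closed-form slope through the origin: sum(x*y) / sum(x^2)
beta1_tilde = np.sum(x * y) / np.sum(x ** 2)

# Least squares with a design matrix containing only x (no intercept column)
beta1_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

# The two should agree up to floating-point error
assert np.isclose(beta1_tilde, beta1_lstsq)
```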

Then, $ \tilde{\beta_1} = \dfrac{\sum{x_i(\beta_0 +\beta_1x_i +u_i)}}{\sum{(x_i)^2}}$

$\implies \tilde{\beta_1} = \beta_0\dfrac{\sum{x_i}}{\sum{(x_i)^2}} +\beta_1 +\dfrac{\sum{x_iu_i}}{\sum{(x_i)^2}}$

Taking expectations on both sides, conditional on the $x_i$ (equivalently, treating the $x_i$ as fixed):

$\implies E[\tilde{\beta_1}] = \beta_0\dfrac{\sum{x_i}}{\sum{(x_i)^2}}+ \beta_1 +\dfrac{\sum{x_iE[u_i]}}{\sum{(x_i)^2}}$ (conditional on the $x_i$ the ratios are constants, so they pass outside the expectation, and the expectation passes through the sums by linearity; note that the expectation of a ratio is not the ratio of expectations, which is why we condition on the $x_i$ here)

Then, by assumption SLR.4 we have $E[u|x]=0$, which gives $E[u_i]=0$ (and hence $E[x_iu_i]=0$).

$\implies E[\tilde{\beta_1}] = \beta_0\dfrac{\sum{x_i}}{\sum{(x_i)^2}}+ \beta_1 +0$

Now, the only problem we have is with the $\beta_0$ term.

If we have that $\beta_0 =0$ or $\sum{x_i}=0$, then the bias term vanishes and $\tilde{\beta_1}$ is an unbiased estimator of $\beta_1$.
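The size of the bias term, $\beta_0\frac{\sum{x_i}}{\sum{(x_i)^2}}$, can be checked by simulation (again my own sketch with arbitrary illustrative parameters): holding the $x_i$ fixed and averaging $\tilde{\beta_1}$ over many draws of the errors, the Monte Carlo bias should match the formula.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 2.0, 0.5          # illustrative true parameters
x = rng.uniform(0.0, 4.0, size=30)  # fixed regressors with sum(x) != 0

# Simulate many samples of y, holding x fixed, and compute tilde{beta}_1 each time
reps = 20_000
u = rng.normal(size=(reps, x.size))
y = beta0 + beta1 * x + u
estimates = (y * x).sum(axis=1) / np.sum(x ** 2)

# Predicted bias from the derivation vs. the Monte Carlo bias
bias_theory = beta0 * np.sum(x) / np.sum(x ** 2)
bias_mc = estimates.mean() - beta1

# The two agree closely, and the bias is clearly nonzero when beta0 != 0
assert abs(bias_mc - bias_theory) < 0.01
assert bias_theory > 0.1
```

Setting `beta0 = 0` in the same script drives `bias_theory` and `bias_mc` to (approximately) zero, matching the conclusion above.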

Can anyone please verify this proof? Also, why don't we write $y= \beta_1x +u$ instead of $y= \beta_0 +\beta_1x +u$ if we're assuming that $\beta_0 =0$ anyway?

Please let me know if my reasoning is valid and if there are any errors.

Thank you.

EDIT:

Here's where I got the slope estimate from:

[image: textbook source of the slope estimate]

Answer:

You are to show $E(\tilde\beta_1)=\beta_1$, but your formula for $\tilde\beta_1$ is not correct. It should be:
\begin{align*} \tilde\beta_1&=\frac{\sum_i(x_i-\bar{x})y_i}{\sum_i(x_i-\bar{x})^2}\quad\text{where}\quad\bar{x}=\frac{1}{n}\sum_ix_i. \end{align*}

Now, first, observe that $\sum_i(x_i-\bar{x})^2=\sum_i(x_i-\bar{x})x_i$, since $\sum_i(x_i-\bar{x})\bar{x}=\bar{x}\sum_i(x_i-\bar{x})=0$. Second, using $y_i=\beta_0+\beta_1x_i+u_i$ together with $\sum_i(x_i-\bar{x})=0$, we have:
$$ \sum_i(x_i-\bar{x})y_i=\sum_i(x_i-\bar{x})(\beta_0+\beta_1x_i+u_i)=\beta_1\sum_i(x_i-\bar{x})x_i+\sum_i(x_i-\bar{x})u_i. $$

Therefore,
$$ \tilde\beta_1=\beta_1+\frac{\sum_i(x_i-\bar{x})u_i}{\sum_i(x_i-\bar{x})^2}=\beta_1+\sum_i\lambda_iu_i,\quad\text{with}\quad\lambda_i=\frac{x_i-\bar{x}}{\sum_j(x_j-\bar{x})^2}. $$

Under the Gauss-Markov assumptions, the $\lambda_i$ are not random, and so:
$$ E(\tilde\beta_1)=\beta_1+E\left(\sum_i\lambda_iu_i\right)=\beta_1+\sum_i\lambda_iE(u_i)=\beta_1. $$
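The key algebraic identity used in the answer, $\sum_i(x_i-\bar{x})^2=\sum_i(x_i-\bar{x})x_i$, can be sanity-checked numerically (a sketch with arbitrary data, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100)
xc = x - x.mean()  # centered x, i.e. x_i - xbar

# sum((x - xbar)^2) equals sum((x - xbar) * x), because sum(x - xbar) = 0
assert np.isclose(np.sum(xc ** 2), np.sum(xc * x))
```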