Property of least squares estimates question


The assumption is the following. $$E[Y_i]=\beta _0 +\beta _{1}X_i, Var[Y_i]=\sigma ^2, Cov[Y_i,Y_j]=0, \forall i\ne j $$

Where $\hat \beta_0$ and $\hat \beta_1 $ are the least squares estimators.

I want to prove that $E[\hat \beta_1]=\beta_1$ and I am looking at my professor's work,

[Image: the professor's derivation of $E[\hat \beta_1]$]

I follow the first two rows, but I do not understand how he obtained the third.

I understand that

$$E[\Sigma (x_i-\bar{x})y_i]=\Sigma \left( E[(x_i-\bar{x})y_i] \right)$$

but I don't see how he got rid of the expected value.

May I get some help?



In this setup $X$ is treated as fixed, not random: the $x_i$ are known constants, so $(x_i - \bar{x})$ factors out of the expectation, and linearity gives $E[(x_i-\bar{x})Y_i] = (x_i-\bar{x})E[Y_i] = (x_i-\bar{x})(\beta_0 + \beta_1 x_i)$. To be more rigorous, I would prefer conditioning on $X$, i.e., \begin{align} \mathbb{E}[\hat{\beta}_1 \mid X = \mathrm{x}] &= \mathbb{E}\left[\frac{\sum(X_i - \bar{X})Y_i}{\sum (X_i - \bar{X})^2} \,\middle|\, X=\mathrm{x} \right]\\ &= \frac{\sum(x_i - \bar{x})\,\mathbb{E}[Y_i \mid X=\mathrm{x}]}{\sum (x_i - \bar{x})^2}\\ &= \frac{\sum(x_i - \bar{x})(\beta_0 + \beta_1 x_i)}{\sum (x_i - \bar{x})^2}\\ &= \beta_1, \end{align} where the last step uses $\sum (x_i - \bar{x})\beta_0 = 0$ (the deviations sum to zero) and $\sum (x_i - \bar{x})x_i = \sum (x_i - \bar{x})^2$.
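As a sanity check (not part of the proof), you can verify the unbiasedness numerically: with the $x_i$ held fixed and fresh noise drawn each time, the average of $\hat\beta_1$ over many simulated samples should be close to the true $\beta_1$. The particular values of $\beta_0$, $\beta_1$, $\sigma$, and the design points below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 3.0, 1.5
x = np.linspace(0, 10, 25)      # fixed design points x_i (not random)
xc = x - x.mean()               # centered deviations (x_i - x_bar)

# Simulate many samples of Y_i = beta0 + beta1*x_i + eps_i and
# compute the least squares slope estimate each time.
estimates = []
for _ in range(20_000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    estimates.append(np.sum(xc * y) / np.sum(xc**2))  # beta1_hat

print(np.mean(estimates))       # should be close to beta1 = 3.0
```

The Monte Carlo average of $\hat\beta_1$ lands very close to $3.0$, consistent with $E[\hat\beta_1] = \beta_1$.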