How to calculate expected value of estimator $b_1$ for linear model


For a linear model $y_i=\beta_0+\beta_1x_i+\epsilon_i$, we have estimators

$b_0=\bar{y}-b_1\bar{x}$

$b_1=\frac{S_{xy}}{S_{xx}}$, where $S_{xy}=\sum{(x_i-\bar{x})(y_i-\bar{y})}$ and $S_{xx}=\sum{(x_i-\bar{x})^2}$ are the sums of squares.
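These estimator formulas can be checked numerically. Below is a minimal sketch using made-up sample data (the values of `x` and `y` are assumptions for illustration), comparing the hand-computed $b_0$ and $b_1$ against `numpy.polyfit`:

```python
import numpy as np

# hypothetical sample data, chosen only for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()
S_xx = np.sum((x - x_bar) ** 2)           # sum of squares of x
S_xy = np.sum((x - x_bar) * (y - y_bar))  # cross sum of squares

b1 = S_xy / S_xx          # slope estimator
b0 = y_bar - b1 * x_bar   # intercept estimator

# compare against NumPy's degree-1 least-squares fit
slope, intercept = np.polyfit(x, y, 1)
print(np.isclose(b1, slope), np.isclose(b0, intercept))
```

Both comparisons should print `True`, since ordinary least squares on a simple linear model reduces to exactly these two formulas.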

I want to show that both $b_0$ and $b_1$ are unbiased, i.e. $E(b_0)=\beta_0$ and $E(b_1)=\beta_1$. Let us start with $b_1$:

$E(b_1)=E(\frac{S_{xy}}{S_{xx}})$

Apparently, the next step is

$E(b_1)=\frac{1}{S_{xx}}E(S_{xy})$

How can one pull the $\frac{1}{S_{xx}}$ out of the expectation but not the $S_{xy}$? I'm just wondering how this works.

I'm aware that

$\frac{S_{xy}}{S_{xx}}=\frac{\sum{x_i(y_i-\bar{y})}}{\sum{x_i(x_i-\bar{x})}}$
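Both this identity and the unbiasedness claim $E(b_1)=\beta_1$ can be sanity-checked numerically. The following sketch (with assumed values $\beta_0=1$, $\beta_1=2$, unit-variance normal errors, and fixed design points; it is a simulation, not a proof) verifies the identity on arbitrary data and then estimates $E(b_1)$ by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) check the identity S_xy = sum x_i(y_i - y_bar) = sum (x_i - x_bar)(y_i - y_bar)
x = rng.normal(size=20)
y = rng.normal(size=20)
x_bar, y_bar = x.mean(), y.mean()
assert np.isclose(np.sum(x * (y - y_bar)), np.sum((x - x_bar) * (y - y_bar)))
assert np.isclose(np.sum(x * (x - x_bar)), np.sum((x - x_bar) ** 2))

# 2) Monte Carlo check of E(b1) = beta_1 (assumed parameters)
beta0, beta1 = 1.0, 2.0
x = np.linspace(0.0, 1.0, 10)        # fixed design points
S_xx = np.sum((x - x.mean()) ** 2)   # constant across simulations

b1_draws = []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(size=x.size)  # new errors each draw
    S_xy = np.sum((x - x.mean()) * (y - y.mean()))
    b1_draws.append(S_xy / S_xx)

print(np.mean(b1_draws))  # should be close to beta_1 = 2.0
```

Note that `S_xx` is computed once outside the loop: the $x_i$ do not change between simulated samples, which mirrors the fixed-design assumption under which $\frac{1}{S_{xx}}$ is a constant in the expectation.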