Suppose that we have independent samples $\{(x_i,y_i):i=1,\dots,n\}$ which are assumed to follow $y_i=\beta_0+\beta_1 x_i+\varepsilon_i$, where the $\varepsilon_i$ are i.i.d. $N(0,\sigma^2)$. Suppose that $b_0$ and $b_1$ are the least squares estimators of $\beta_0$ and $\beta_1$, respectively. Define $\hat y_i=b_0+b_1 x_i$ and $e_i=y_i-\hat y_i$.
Prove each of the following:
- $$\frac{1}{n} \sum_{i=1}^n \hat y_i = \bar y$$
- $$\sum_{i=1}^n e_i = 0$$
- $$\sum_{i=1}^n (\hat y_i - \bar y)\, e_i = 0$$
I know that I should use this fact but I could not figure it out till now:
$$ \hat y_i = b_0 + b_1 x_i = \bar y + b_1 (x_i - \bar x) $$
Any advice will be greatly appreciated.
All these results stem from the first-order conditions. Since you are minimizing $$ \sum ( y_i - \beta_0 - \beta_1 x_i)^2, $$ setting the derivative with respect to $\beta_0$ to zero gives $$ -2\sum ( y_i - \hat \beta_0 - \hat \beta_1 x_i)=0. $$ As $\hat y_i = \hat \beta_0 + \hat \beta_1 x_i$, this means $$ \sum ( y_i - \hat y_i)=0 \implies \frac{1}{n} \sum y_i = \frac{1}{n} \sum \hat y_i, $$ which is the first identity.

Next, note that $e_i = y_i - \hat y_i$, so the same equation gives $$ \sum ( y_i - \hat y_i)= \sum e_i = 0, $$ the second identity.

For the third one, the partial derivative with respect to $\beta_1$ gives $$ \sum x_i ( y_i - \hat \beta_0 - \hat \beta_1 x_i ) = \sum x_i e_i = 0. $$ Combine this with $\bar{y} = \hat \beta_0 + \hat \beta_1 \bar x$ (equivalently, your hint $\hat y_i = \bar y + \hat \beta_1 (x_i - \bar x)$): $$ \sum (\hat y_i - \bar y)\, e_i = \hat \beta_1 \sum (x_i - \bar x)\, e_i = \hat \beta_1 \left( \sum x_i e_i - \bar x \sum e_i \right) = 0. $$
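As a sanity check, the three identities can be verified numerically on simulated data. This is just an illustrative sketch (the sample size, seed, and true coefficients below are arbitrary choices, not from the question):

```python
import random

# Simulate data from y_i = beta0 + beta1 * x_i + eps_i
# (beta0 = 2, beta1 = 3 are arbitrary illustrative values).
random.seed(0)
n = 50
x = [random.gauss(0, 1) for _ in range(n)]
y = [2.0 + 3.0 * xi + random.gauss(0, 1) for xi in x]

x_bar = sum(x) / n
y_bar = sum(y) / n

# Least squares estimates b0, b1 via the usual closed-form solution.
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

y_hat = [b0 + b1 * xi for xi in x]       # fitted values
e = [yi - yh for yi, yh in zip(y, y_hat)]  # residuals

# The three identities, up to floating-point error:
print(abs(sum(y_hat) / n - y_bar) < 1e-10)                              # (1/n) sum y_hat_i = y_bar
print(abs(sum(e)) < 1e-10)                                              # sum e_i = 0
print(abs(sum((yh - y_bar) * ei for yh, ei in zip(y_hat, e))) < 1e-10)  # sum (y_hat_i - y_bar) e_i = 0
```

All three comparisons print `True`, matching the algebraic argument above.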