The steps to get $\beta$ in least squares estimation: why was $x_i$ removed?


Please see the following steps

Get $\beta$ in Least Square Estimation:

[Image of the derivation steps omitted]


$\sum\limits_{i=1}^n \left(y_i-\bar{y}+\hat{\beta}(\bar{x}-x_i)\right) =0 \implies \sum\limits_{i=1}^n (y_i-\bar{y}) = - \hat{\beta}\sum\limits_{i=1}^n(\bar{x}-x_i) \implies \hat{\beta} = \dfrac{\sum\limits_{i=1}^n (y_i-\bar{y}) }{\sum\limits_{i=1}^n(x_i-\bar{x}) }$ seems a strange thing to say, since $\sum\limits_{i=1}^n (y_i-\bar{y})=0$ and $\sum\limits_{i=1}^n (x_i-\bar{x})=0$ by the definitions of $\bar{y}$ and $\bar{x}$. Saying $\hat{\beta} =\frac00$ would not be helpful.
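A quick numerical sketch (with made-up data, purely for illustration) confirms why that ratio degenerates: both centered sums vanish by the definition of the sample means, so the quotient is $0/0$.

```python
import numpy as np

# Hypothetical data, just to illustrate the point numerically.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Both centered sums are zero by definition of the means.
sum_y_dev = np.sum(y - y.mean())  # sum of (y_i - ybar)
sum_x_dev = np.sum(x - x.mean())  # sum of (x_i - xbar)

print(sum_y_dev)  # 0 up to floating-point rounding
print(sum_x_dev)  # 0 up to floating-point rounding
```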

That being said, $\sum\limits_{i=1}^n \left(y_i-\bar{y}+B(\bar{x}-x_i)\right) =0$ is obviously true for any $B$, so in particular $\sum\limits_{i=1}^n \left(y_i-\bar{y}+\hat{\beta}(\bar{x}-x_i)\right) =0$, and multiplying through by the constant $\bar{x}$ gives $\sum\limits_{i=1}^n \left(y_i-\bar{y}+\hat{\beta}(\bar{x}-x_i)\right)\bar{x} =0$. Subtract that from the earlier normal equation $\sum\limits_{i=1}^n \left(y_i-\bar{y}+\hat{\beta}\bar{x}-\hat{\beta}x_i\right)x_i =0$ to get $\sum\limits_{i=1}^n \left(y_i-\bar{y}-\hat{\beta}(x_i-\bar{x})\right)(x_i-\bar{x}) =0$, and hence the final line and the desired conclusion, $\hat{\beta} = \dfrac{\sum\limits_{i=1}^n (y_i-\bar{y})(x_i-\bar{x})}{\sum\limits_{i=1}^n (x_i-\bar{x})^2}$.
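The corrected slope formula can be checked numerically (again with hypothetical data): the ratio of the centered cross-sum to the centered sum of squares should match the slope NumPy's least-squares polynomial fit returns.

```python
import numpy as np

# Hypothetical data for a sanity check of the derivation.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# beta_hat = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2)
xd, yd = x - x.mean(), y - y.mean()
beta_hat = np.sum(xd * yd) / np.sum(xd ** 2)

# Compare against NumPy's degree-1 least-squares fit.
slope, intercept = np.polyfit(x, y, 1)
print(beta_hat, slope)  # the two slopes agree
```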