To derive the least squares estimators:
We have $SS(\alpha,\beta)= \sum(y_i-\alpha-\beta x_i)^2$ and take the partial derivative with respect to each parameter. The answer I get is:
$\beta = \dfrac{\sum (y_i-\bar{y})x_i}{\sum (x_i-\bar{x})x_i}$, but the book's solution "has an easier to work with form": $\beta = \dfrac{\sum (y_i-\bar{y})(x_i-\bar{x})}{\sum (x_i-\bar{x})^2}$.
What is the reasoning for writing it that way?
Edit: Taking the partial derivative with respect to $\alpha$ and setting it equal to zero, I get $\alpha = \bar{y}-\beta\bar{x}$. Taking the partial derivative with respect to $\beta$, I get $-2\sum(y_i - \alpha - \beta x_i)x_i = 0$; after substituting for $\alpha$, distributing the $x_i$, and reorganizing terms, I get $\sum(y_i-\bar{y})x_i = \beta \sum(x_i - \bar{x})x_i$, which I solved for $\beta$.
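Spelling that step out in full (substituting $\alpha=\bar y-\beta\bar x$ into the $\beta$ equation):
$$
\begin{aligned}
-2\sum(y_i-\alpha-\beta x_i)x_i &= 0\\
\sum y_i x_i &= \alpha\sum x_i + \beta\sum x_i^2\\
\sum y_i x_i &= (\bar y-\beta\bar x)\sum x_i + \beta\sum x_i^2\\
\sum(y_i-\bar y)x_i &= \beta\sum(x_i-\bar x)x_i.
\end{aligned}
$$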
So your $\beta= \dfrac{\sum(y_i-\bar y)x_i}{\sum(x_i-\bar x)x_i} =\dfrac{\sum(y_i-\bar y)(x_i-\bar x)}{\sum(x_i-\bar x)^2}$, since $\sum(y_i-\bar y)\bar x=0$ and $\sum(x_i-\bar x)\bar x=0$: replacing $x_i$ by $x_i-\bar x$ in the second factor changes neither the numerator nor the denominator.
EDIT:
$$\sum(y_i-\bar y)\bar x = \bar x\sum(y_i- \bar y) = \bar x\Big[\sum y_i-\sum \bar y\Big] = \bar x\,[n\bar y-n\bar y] = 0,$$
and the same argument gives $\sum(x_i-\bar x)\bar x = 0$.
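As a quick sanity check (not part of the derivation), here is a small NumPy sketch on made-up data showing that the two expressions for $\beta$ agree numerically; the data, seed, and sample size are arbitrary choices:

```python
import numpy as np

# Arbitrary made-up data, just to check the algebra numerically.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)

xbar, ybar = x.mean(), y.mean()

# beta as it falls out of the normal equations:
beta_raw = np.sum((y - ybar) * x) / np.sum((x - xbar) * x)

# the book's symmetric form:
beta_book = np.sum((y - ybar) * (x - xbar)) / np.sum((x - xbar) ** 2)

print(beta_raw, beta_book)
assert np.isclose(beta_raw, beta_book)
```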