I was reading a document on linear regression and I can't understand how the author arrived at the estimate of the $\beta$ parameter. The results are as follows:
$SQ(\alpha, \beta)=\sum\limits_{i=1}^n\epsilon_i^2=\sum\limits_{i=1}^n(y_i-\alpha-\beta x_i)^2$
$\frac{\partial SQ(\alpha, \beta)}{\partial \alpha}\vert_{\alpha= \hat{\alpha}} = 0 \implies \sum\limits_{i=1}^n(y_i - \hat{\alpha} - \hat{\beta} x_i) = 0 $
$\frac{\partial SQ(\alpha, \beta)}{\partial \beta}\vert_{\beta= \hat{\beta}} = 0 \implies \sum\limits_{i=1}^n x_i(y_i - \hat{\alpha} - \hat{\beta} x_i) = 0$
$\sum\limits_{i=1}^n y_i = n \hat{\alpha} + \hat{\beta} \sum\limits_{i=1}^n x_i \implies \hat{\alpha} = \bar{y} - \hat{\beta}\bar{x}$
$\sum\limits_{i=1}^n x_i y_i = \hat{\alpha} \sum\limits_{i=1}^n x_i + \hat{\beta} \sum\limits_{i=1}^n x_i^2 \implies \hat{\beta} = \frac{\sum\limits_{i=1}^n x_i y_i - n\bar{x}\bar{y}}{\sum\limits_{i=1}^n x_i^2 - n\bar{x}^2}$
However, when I work through the partial derivative with respect to $\beta$ myself, I get $\hat{\beta} = \frac{\sum\limits_{i=1}^nx_iy_i - \bar{x}\bar{y}}{\sum\limits_{i=1}^nx_i^2-\bar{x}^2}$
What is going on? What did I miss?
It is likely due to missing parentheses in your attempt: $\bar{x}^2$ is constant in $i$, so summing it over $n$ terms contributes a factor of $n$, $$\sum_{i=1}^n (x_i^2 - \bar{x}^2) = \left(\sum_{i=1}^n x_i^2\right) - n \bar{x}^2,$$ and the same factor appears in the numerator, since $\sum_{i=1}^n \bar{x}\bar{y} = n\bar{x}\bar{y}$.
By substituting $\hat{\alpha} = \bar{y} - \hat{\beta} \bar{x}$ and $\sum_i x_i = n \bar{x}$ into the last equation we have \begin{align} \sum_i x_i y_i &= (\bar{y} - \hat{\beta} \bar{x}) \sum_i x_i + \hat{\beta} \sum_i x_i^2 \\ \sum_i x_i y_i &= n \bar{x}\bar{y} + \hat{\beta} \left(\left(\sum_i x_i^2\right) - n \bar{x}^2\right) \end{align} and solving for $\hat{\beta}$ reproduces the author's result: $$\hat{\beta} = \frac{\sum_i x_i y_i - n\bar{x}\bar{y}}{\sum_i x_i^2 - n\bar{x}^2}.$$
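A quick numeric check (a sketch with a small made-up dataset, not from the original document) shows that the factor $n$ matters: the closed-form $\hat{\beta}$ with the $n$ factor matches NumPy's least-squares line fit, while the version without $n$ generally does not.

```python
import numpy as np

# Small made-up dataset, purely for checking the algebra.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 7.0, 9.0])

n = len(x)
xbar, ybar = x.mean(), y.mean()

# Correct estimator: note the factor n on both x̄·ȳ and x̄².
beta_hat = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x**2) - n * xbar**2)
alpha_hat = ybar - beta_hat * xbar

# Cross-check against NumPy's degree-1 least-squares fit.
slope, intercept = np.polyfit(x, y, 1)
print(beta_hat, alpha_hat)  # 2.4 -0.5, matching slope and intercept

# Dropping the factor n (as in the asker's formula) gives a different value.
beta_wrong = (np.sum(x * y) - xbar * ybar) / (np.sum(x**2) - xbar**2)
```

For this data the correct formula gives $\hat{\beta} = 2.4$, $\hat{\alpha} = -0.5$, exactly what `polyfit` returns, while the formula without the $n$ factor gives roughly $2.24$.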