Formula for the error terms squared of a linear regression model


I need to prove that the sum of squared errors $\sum\epsilon_i^2$ of a linear model is equal to a given formula:

$\sum_{i=1}^{n}(Y_i-\alpha-\beta(x_i-\bar{x}))^2=n(\hat{\alpha}-\alpha)^2+(\hat{\beta}-\beta)^2(\sum_{i=1}^{n}(x_i-\bar{x})^2)+\sum_{i=1}^{n}(Y_i-\hat{\alpha}-\hat{\beta}(x_i-\bar{x}))^2$

If we expand the left-hand side as

$\sum_{i=1}^{n}(Y_i-\alpha-\beta(x_i-\bar{x}))^2=\sum_{i=1}^{n}\left((\hat{\alpha}-\alpha)+(\hat{\beta}-\beta)(x_i-\bar{x})+(Y_i-\hat{\alpha}-\hat{\beta}(x_i-\bar{x}))\right)^2$,

the result we are trying to prove is fairly obvious. Indeed, we get

$n(\hat{\alpha}-\alpha)^2+(\hat{\beta}-\beta)^2\left(\sum_{i=1}^{n}(x_i-\bar{x})^2\right)+\sum_{i=1}^{n}(Y_i-\hat{\alpha}-\hat{\beta}(x_i-\bar{x}))^2 \mathbf{+2\sum_{i=1}^{n}\left[(\hat{\alpha}-\alpha)(\hat{\beta}-\beta)(x_i-\bar{x})+(\hat{\alpha}-\alpha)(Y_i-\hat{\alpha}-\hat{\beta}(x_i-\bar{x}))+(\hat{\beta}-\beta)(x_i-\bar{x})(Y_i-\hat{\alpha}-\hat{\beta}(x_i-\bar{x}))\right]}$.

How can I show that the last sum (in bold) is equal to zero, therefore completing my proof?
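For what it's worth, the identity (and the vanishing of the cross terms) can be checked numerically. Below is a small sketch using NumPy with simulated data; the sample size, true parameters, and noise level are my own arbitrary choices, and $\hat{\alpha}=\bar{Y}$, $\hat{\beta}=\sum(x_i-\bar{x})Y_i/\sum(x_i-\bar{x})^2$ are the least-squares estimates for this centered parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
alpha, beta = 2.0, -1.5                 # arbitrary "true" parameters
x = rng.normal(size=n)
xc = x - x.mean()                       # centered predictor x_i - x_bar
Y = alpha + beta * xc + rng.normal(scale=0.3, size=n)

# Least-squares estimates for the centered model Y = alpha + beta*(x - x_bar) + eps
alpha_hat = Y.mean()
beta_hat = (xc @ Y) / (xc @ xc)

resid = Y - alpha_hat - beta_hat * xc   # fitted residuals

# Left-hand side of the identity
lhs = np.sum((Y - alpha - beta * xc) ** 2)

# Right-hand side: the three squared terms
rhs = (n * (alpha_hat - alpha) ** 2
       + (beta_hat - beta) ** 2 * np.sum(xc ** 2)
       + np.sum(resid ** 2))

# The cross-term sum that should vanish
cross = 2 * np.sum((alpha_hat - alpha) * (beta_hat - beta) * xc
                   + (alpha_hat - alpha) * resid
                   + (beta_hat - beta) * xc * resid)

print(lhs, rhs, cross)  # lhs and rhs agree; cross is zero up to rounding
```

The cross terms vanish because $\sum(x_i-\bar{x})=0$ and because the residuals satisfy the normal equations $\sum \hat{\epsilon}_i = 0$ and $\sum(x_i-\bar{x})\hat{\epsilon}_i = 0$.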

By the way, the long equations can make my question hard to understand. Therefore, do not hesitate to ask me to clarify something!