I am trying to prove that the estimated variance of the residual
$$\hat \sigma ^2 = \frac{\Sigma(y_i-\hat{y_i})^2}{n-2}$$
is an unbiased estimator of the variance of the error $\sigma^2$.
So far what I know is that
$$\hat{y}_i = \hat \beta_0 + \hat \beta_1 x_i,$$ and with help I was able to prove the property $$E[\Sigma(y_i-\bar y)^2] = (n-1)\sigma^2+\beta_1^2 \Sigma(x_i-\bar x)^2.$$ I also expanded the expression and played around with the $\Sigma E[y_i]$ and $\Sigma E[y_i^2]$ terms, but was not sure how to manipulate the $\Sigma x_i$ terms...
Can I get some help, please?
Note that $\{y_i\}_{i=1}^n$ are independent with $y_i \sim N(\beta_0 + \beta_1x_i, \sigma^2)$ (they are not identically distributed, since the means differ), and $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1x_i$ is the estimator of $\mathbb{E}y_i$. Then $$ \sum_{i=1}^n\frac{(y_i - \hat{y}_i)^2}{\sigma^2} \sim \chi^2_{n-2}, $$ where two degrees of freedom are "lost" due to the estimation of $\beta_0$ and $\beta_1$. Since a $\chi^2_{n-2}$ random variable has mean $n-2$, $$ \mathbb{E}\sum_{i=1}^n\frac{(y_i - \hat{y}_i)^2}{n-2} =\frac{\sigma^2}{n-2}\mathbb{E}\sum_{i=1}^n\frac{(y_i - \hat{y}_i)^2}{\sigma^2} = \frac{\sigma^2(n-2)}{n-2} = \sigma^2. $$
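As a quick numerical sanity check of the unbiasedness claim, here is a minimal Monte Carlo sketch. The design (`x` on a grid, `beta0 = 2`, `beta1 = 3`, `sigma2 = 4`, and the replication count) is entirely my own toy choice, not anything from the problem; the point is only that the average of $\hat\sigma^2 = \mathrm{RSS}/(n-2)$ over many simulated data sets should land near $\sigma^2$:

```python
import numpy as np

# Toy setup (hypothetical values, chosen only for illustration)
rng = np.random.default_rng(0)
n, reps = 20, 100_000
x = np.linspace(0.0, 10.0, n)
beta0, beta1, sigma2 = 2.0, 3.0, 4.0

# Simulate reps independent data sets: y = beta0 + beta1*x + eps,
# with eps ~ N(0, sigma2). Each row of Y is one data set.
Y = beta0 + beta1 * x + rng.normal(0.0, np.sqrt(sigma2), (reps, n))

# OLS fit for every replication at once (vectorized over rows).
xc = x - x.mean()
b1 = Y @ xc / (xc @ xc)              # slope: sum (x_i - xbar) y_i / sum (x_i - xbar)^2
b0 = Y.mean(axis=1) - b1 * x.mean()  # intercept: ybar - b1 * xbar
resid = Y - (b0[:, None] + b1[:, None] * x)

# sigma^2-hat = RSS / (n - 2) for each replication.
est = (resid ** 2).sum(axis=1) / (n - 2)
print(est.mean())  # should be close to sigma2 = 4
```

Dividing by $n$ or $n-1$ instead of $n-2$ in the last line shifts the average visibly below or above $\sigma^2$, which is a nice way to see where the two lost degrees of freedom bite.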