I am having trouble proving the identity below.
$E\left[\sum(y_i-\bar{y})^2\right]=(n-1)\sigma^2 +\beta_1^2\sum(x_i-\bar{x})^2$
where the assumptions are
$Cov[y_i,y_j]=0$ for $i \ne j$
$E[y_i]=\beta_0+\beta_1x_i, Var[y_i]=\sigma^2$
$\hat\beta_0$ and $\hat\beta_1$ are the least squares estimates of $\beta_0$ and $\beta_1$.
So far I understand that $$E[\hat\beta_1^2]=\beta_1^2 +\frac{\sigma^2}{\sum(x_i-\bar x)^2}$$
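(For the record, I got that expression from the usual decomposition of a second moment, together with the standard facts that $\hat\beta_1$ is unbiased and that $\operatorname{Var}[\hat\beta_1]=\sigma^2/\sum(x_i-\bar x)^2$:
$$E[\hat\beta_1^2]=\operatorname{Var}[\hat\beta_1]+\big(E[\hat\beta_1]\big)^2=\frac{\sigma^2}{\sum(x_i-\bar x)^2}+\beta_1^2.)$$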
but I'm really having trouble understanding the relationship between $x$ and $y$ :(
I am thinking that
$$E\left[n \frac{1}{n}\Sigma(y_i-\bar{y})^2 \right]=n E[Var[Y_i]]= n\sigma^2$$ which looks nothing like the expression...
May I get some help, please?
Use $\sum(y_i-\bar{y})^2 = \sum y_i^2 - n\bar{y}^2$ together with $\mathbb{E}[w^2]=\operatorname{Var}(w)+(\mathbb{E}w)^2$, applied to each $y_i$ and to $\bar{y}$:
\begin{align} \mathbb{E}\sum (y_i - \bar{y})^2 &= \mathbb{E}\sum y_i^2 - n\,\mathbb{E}\bar{y}^2\\ &= n\operatorname{Var}(y_i)+\sum\big(\mathbb{E}[y_i]\big)^2-n\big(\operatorname{Var}(\bar{y})+(\mathbb{E}[\bar{y}])^2\big)\\ &= n\sigma^2 + n \beta_0^2 + 2\beta_0\beta_1 \sum x_i + \beta_1^2\sum x^2_i -\sigma^2 - n\beta_0^2 - 2n\beta_0\beta_1\bar{x} - n\beta_1^2\bar{x}^2\\ &= (n-1)\sigma^2 +\beta_1^2\Big(\sum x_i^2 - n \bar{x}^2\Big)\\ &= (n-1)\sigma^2 +\beta_1^2\sum (x_i - \bar{x})^2, \end{align}
where the cross terms cancel because $\sum x_i = n\bar{x}$, and $\operatorname{Var}(\bar{y})=\sigma^2/n$ by the zero-covariance assumption.
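If it helps build intuition, the identity is easy to check numerically with a quick Monte Carlo sketch (the design points `x` and the parameter values below are arbitrary illustrative choices, not anything from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 3.0, 1.5
x = np.linspace(0.0, 5.0, 20)   # fixed design points (arbitrary choice)
n = len(x)
reps = 200_000

# Each row is one sample of (y_1, ..., y_n) with E[y_i] = beta0 + beta1*x_i
# and Var[y_i] = sigma^2, independent across i.
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(reps, n))

# Sum of squared deviations about the sample mean, per replication.
ss = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

lhs = ss.mean()  # Monte Carlo estimate of E[sum (y_i - ybar)^2]
rhs = (n - 1) * sigma**2 + beta1**2 * ((x - x.mean()) ** 2).sum()
print(f"Monte Carlo: {lhs:.2f}, formula: {rhs:.2f}")
```

The two printed numbers should agree to within simulation noise, which makes the $\beta_1^2\sum(x_i-\bar x)^2$ term concrete: it is the spread in the $y_i$ coming from their means varying with $x_i$, on top of the $(n-1)\sigma^2$ from the noise.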