As the question title suggests, I'm used to $\overline{y}$ being $\frac{1}{n}\sum^n _{i=1} y_i$, but in the following proof this does not seem to be the case.
$$SS_{reg}=\sum^n_{i=1}(\hat{y}_i-\overline{y})^2=\sum^n_{i=1}[(\hat{\beta}_0+\hat{\beta}_1x_i)-(\hat{\beta}_0+\hat{\beta}_1\bar{x})]^2$$ $$=\sum^n_{i=1}\hat{\beta}_1^2(x_i-\bar{x})^2 = \hat{\beta}_1^2\sum^n_{i=1}(x_i-\bar{x})^2 =\hat{\beta}_1^2S_{xx}$$
and this result works out just fine. Is there a proof that $\overline{y}=\frac{1}{n}\sum^n_{i=1} y_i=\hat{\beta}_0+\hat{\beta}_1\bar{x}$?
Here I'm assuming that $\overline{x}=\frac{1}{n}\sum^n_{i=1}x_i$.
For $y_i = \beta_0 + \beta_1x_i +\epsilon_i$, the OLS estimators are $$ \hat{\beta}_1= \frac{\sum(y_i - \bar{y})(x_i - \bar{x})}{\sum(x_i - \bar{x})^2}\, ,\qquad \hat{\beta}_0 = \bar{y}- \hat{\beta}_1\bar{x}\, , $$ hence, if you plug in $x_0 = \bar{x}$ in the estimated equation, you get $$ \hat{y}(x_0 = \bar{x})=\hat{\beta}_0 + \hat{\beta}_1\bar{x}=\bar{y}- \hat{\beta}_1\bar{x}+ \hat{\beta}_1\bar{x} = \bar{y}. $$
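In case it helps, here is a sketch of where $\hat{\beta}_0 = \bar{y}- \hat{\beta}_1\bar{x}$ itself comes from: minimizing the residual sum of squares and setting the derivative with respect to $\beta_0$ to zero gives the first normal equation, which forces the fitted line through $(\bar{x},\bar{y})$:

$$ \frac{\partial}{\partial \beta_0}\sum^n_{i=1}(y_i-\beta_0-\beta_1 x_i)^2 \;=\; -2\sum^n_{i=1}\big(y_i-\hat{\beta}_0-\hat{\beta}_1 x_i\big)=0 \;\;\Longrightarrow\;\; \sum^n_{i=1} y_i = n\hat{\beta}_0 + \hat{\beta}_1\sum^n_{i=1} x_i\, . $$

Dividing both sides by $n$ yields $\bar{y} = \hat{\beta}_0 + \hat{\beta}_1\bar{x}$ directly, which is exactly the identity the question asks about.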