In simple linear regression, it seems that the $R^{2}$ of the regression is always less than $1$.
Of course, if we take the formula for $R^{2}$, it is just $1 - \text{SSR}/\text{SST}$, and the SSR (residual sum of squares) is always nonnegative, so $R^2 \le 1$ — but it seems the inequality should typically be strict.
Is there any way to formally prove this assertion?
In simple linear regression, $R^2$ is the same as $r^2$, the squared Pearson correlation coefficient. As such, whenever there is an exact linear relationship between $X$ and $Y$, you get $R^2 = 1$: if $Y = a + bX$, then there are no residuals, since $\hat{a} = a$ and $\hat{b} = b$, hence $SSRes = 0$.

Another degenerate case is having only $2$ data points. In that case you can always draw a straight line through the points, with $b = (y_1 - y_2)/(x_1 - x_2)$ and $a = y_1 - b x_1$, so once again there are no residuals, $SSRes = 0$, and hence $R^2 = 1$.

For any other configuration, i.e., $n > 2$ and non-degenerate data, $R^2 < 1$. This follows from the Cauchy–Schwarz inequality: $$ r^2 = \frac{ \left( \sum(x_i-\bar x )(y_i - \bar y)\right)^2}{\sum ( x_i - \bar{x})^2 \sum( y_i - \bar{y})^2}, $$ where $ \left( \sum(x_i-\bar x )(y_i - \bar y) \right)^2 \le \sum ( x_i - \bar{x})^2 \sum ( y_i- \bar y )^2$, so $r^2 \le 1$, with equality if and only if the centered vectors $(x_i - \bar x)$ and $(y_i - \bar y)$ are proportional — that is, exactly when the data lie on a single straight line.
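A quick numerical sanity check of the cases above — exact linear data giving $R^2 = 1$ (up to floating point), noisy data giving $R^2 < 1$, and $R^2 = r^2$ in the simple-regression setting. This is just an illustrative sketch with NumPy; the helper `r_squared` is my own, not a standard function:

```python
import numpy as np

def r_squared(x, y):
    """R^2 = 1 - SSRes/SSTot for the least-squares line through (x, y)."""
    b, a = np.polyfit(x, y, 1)              # slope, intercept
    resid = y - (a + b * x)
    ss_res = np.sum(resid**2)               # residual sum of squares
    ss_tot = np.sum((y - y.mean())**2)      # total sum of squares
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
x = rng.normal(size=50)

# Exact linear relationship: no residuals, so R^2 = 1
y_exact = 2.0 + 3.0 * x
print(np.isclose(r_squared(x, y_exact), 1.0))               # True

# Noisy data with n > 2: R^2 strictly below 1
y_noisy = 2.0 + 3.0 * x + rng.normal(size=50)
print(r_squared(x, y_noisy) < 1.0)                          # True

# In simple regression, R^2 equals the squared Pearson correlation
r = np.corrcoef(x, y_noisy)[0, 1]
print(np.isclose(r_squared(x, y_noisy), r**2))              # True
```

The two-point case behaves like the exact-linear case: the fitted line passes through both points, so $SSRes = 0$ there as well.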