In my statistics textbook there is the following exercise:
For $n=8$ and $\sum x=552$ and $\sum x^2=48000$ calculate $S^2=\frac{\sum x^2}{n}-\bar{x}^2$.
I'm coming from a probability background, so I'm guessing from context that $\bar{x}$ is the expected value of $x$ and $S^2$ is the variance of $x$. But what is the connection between $\sum x$ and $\bar{x}$? How do I calculate $\bar{x}$? What does $\sum$ mean without limits? Are the limits of $\sum x$ the same as those of $\sum x^2$?
Comment: Let data be $x_1, x_2, \dots, x_n.$ Then
$\sum x$ is shorthand for $\sum_{i=1}^n x_i,$ frequently used when one specific sample is at hand. Then the sample mean is defined as $\bar x =\frac 1n \sum_{i=1}^n x_i.$ (It is best to reserve the word expectation for population means and means of random variables.)
The sample variance is defined as $S^2 = S_x^2 = \frac{1}{n-1}\sum_{i=1}^n (x_i - \bar x)^2.$
An equivalent formula for the sample variance is $$S^2 = \frac{\sum_{i=1}^n x_i^2\; -\;\frac 1n\left(\sum_{i=1}^n x_i\right)^2 }{n-1} = \frac{\sum_{i=1}^n x_i^2\; -n\bar x^2 }{n-1}.$$
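Plugging your exercise's numbers into these formulas makes the connection concrete. A minimal Python sketch (the variable names are mine, chosen to mirror the notation):

```python
# Exercise data: n = 8, sum of x = 552, sum of x^2 = 48000.
n = 8
sum_x = 552.0     # shorthand for sum_{i=1}^n x_i
sum_x2 = 48000.0  # shorthand for sum_{i=1}^n x_i^2

# Sample mean: xbar = (1/n) * sum x.
xbar = sum_x / n

# Your textbook's version of the variance (divide by n).
s2_n = sum_x2 / n - xbar**2

# The more common sample variance (divide by n - 1),
# using the equivalent formula above.
s2_n1 = (sum_x2 - sum_x**2 / n) / (n - 1)

print(xbar)    # 69.0
print(s2_n)    # 1239.0
print(s2_n1)   # 1416.0
```

So the same summary statistics $\sum x$ and $\sum x^2$ determine $\bar x$ and either version of $S^2$; the two conventions differ only in the divisor.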
Note: A few textbooks use $n$ instead of $n-1$ in the denominator of $S^2.$ Yours seems to be one of them.
I guess this is intended as a momentary simplification, perhaps to avoid explaining to beginners why one uses $n-1$ instead of $n$. (One good reason is that the $n-1$ divisor makes $E(S^2) = \sigma^2$.)
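That unbiasedness claim is easy to check by simulation. A quick Python sketch (the sample size, population variance, and repetition count are arbitrary choices of mine): dividing by $n-1$ hits the true $\sigma^2$ on average, while dividing by $n$ comes up short by a factor of $(n-1)/n$.

```python
import random

random.seed(1)
n = 8
sigma2 = 4.0       # true population variance
reps = 200_000     # number of simulated samples

tot_n, tot_n1 = 0.0, 0.0
for _ in range(reps):
    # Draw a sample of size n from N(0, sigma2).
    x = [random.gauss(0, sigma2**0.5) for _ in range(n)]
    xbar = sum(x) / n
    ss = sum((xi - xbar)**2 for xi in x)  # sum of squared deviations
    tot_n += ss / n          # divide-by-n estimator
    tot_n1 += ss / (n - 1)   # divide-by-(n-1) estimator

# Dividing by n averages to about sigma2*(n-1)/n = 3.5 here;
# dividing by n-1 averages to about sigma2 = 4.0.
print(tot_n / reps)
print(tot_n1 / reps)
```

With $n=8$ the divide-by-$n$ estimator averages to about $\sigma^2 \cdot 7/8$, which is exactly the bias the $n-1$ divisor removes.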
This alternative definition is not "wrong," and there are some technical reasons for making that choice. However, later on the text will have a lot of explaining to do about being out of sync with the rest of the statistical world. (If it's an elementary text, there is no 'later on', and only the inquisitive students are inconvenienced.)