I've been preparing for an exam tomorrow and was doing a past paper when it appears the mark scheme has an answer that I just cannot fathom.
The exact question is to "find a 95% confidence interval for the true mean time for the task".
The only data I've been given are
$n=80$,
$Σx=555.20$, and
$Σx^2=3863.9031$
I know I can calculate the sample mean using $\bar{x} = \frac{\sum x}{n}$, which gives me $6.94$. From my understanding, the sample variance is calculated by doing $\frac{\sum x^2-(\bar{x})^2}{n-1}$; taking the square root of this gives me $6.94985611$, which is different from the answer given, stated as $s = 0.37$ (and $s^2 = 0.1369$).
So, how do I arrive at an answer of $0.37$?
(Paper reference: June 2013 MEI Statistics 3 Question 1 ii)
To avoid square roots for the moment, the definition of the sample variance is $$S^2 = \frac{\sum (X_i - \bar X)^2}{n-1}.$$
After a little algebra, the numerator can be written in several ways: $$(n-1)S^2 = \sum (X_i - \bar X)^2 = \sum X_i^2 - n\bar X^2 = \sum X_i^2 - \frac{(\sum X_i)^2}{n}.$$
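The equivalence of these three forms is easy to check numerically. A small Python sketch, using made-up data (not the exam's):

```python
# Verify that the three algebraically equivalent forms of (n-1)S^2 agree.
# The data here are invented purely for illustration.
xs = [2.0, 3.5, 4.1, 5.0, 6.3]
n = len(xs)
xbar = sum(xs) / n

form1 = sum((x - xbar) ** 2 for x in xs)           # definition
form2 = sum(x * x for x in xs) - n * xbar ** 2     # second form
form3 = sum(x * x for x in xs) - sum(xs) ** 2 / n  # third form

print(form1, form2, form3)  # all three agree up to floating-point error
```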
Your expression $\frac{\sum x^2 - \bar x^2}{n-1}$ seems to use the second of these, but with a missing factor of $n$ before $\bar X^2$.
Using this second expression, one has $$S^2 = \frac{3863.9031 - 80(6.94)^2}{79} = 0.1369,$$ and thus $S = \sqrt{0.1369} = 0.37.$
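As a sanity check, here is the same computation in Python, a sketch using only the summary statistics given in the question:

```python
from math import sqrt

# Summary statistics from the question
n = 80
sum_x = 555.20
sum_x2 = 3863.9031

xbar = sum_x / n                          # sample mean: 6.94
s2 = (sum_x2 - n * xbar ** 2) / (n - 1)  # second form of (n-1)S^2, over n-1
s = sqrt(s2)

print(round(s2, 4), round(s, 2))  # 0.1369 0.37
```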
Then a 95% confidence interval is $\bar X \pm t^*S/\sqrt{n}$ or $6.94 \pm 1.99045(0.37)/8.9443,$ which is $6.940 \pm 0.0823,$ using the t distribution with 79 degrees of freedom.
In this expression, the factor $S/\sqrt{n}$ is the (estimated) standard error of the mean; that is, $SD(\bar X) = \sigma/\sqrt{n}$, which is estimated by $S/\sqrt{n} = 0.370/\sqrt{80} = 0.0414.$ (Sometimes, especially in software output, the term 'estimated standard error' is abbreviated to 'standard error' or just 'SE' when it is clear that $\sigma$ is unknown.)
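Putting the pieces together, the interval itself can be sketched in Python. (To keep the sketch dependency-free, the critical value $t^* = 1.99045$ for 79 degrees of freedom is taken from the answer above rather than computed; with SciPy one could use `scipy.stats.t.ppf(0.975, 79)` instead.)

```python
from math import sqrt

n, xbar, s = 80, 6.94, 0.37
t_star = 1.99045  # 97.5th percentile of the t distribution with 79 df (from above)

se = s / sqrt(n)       # estimated standard error of the mean
margin = t_star * se   # half-width of the 95% confidence interval

print(round(se, 4))      # 0.0414
print(round(margin, 4))  # 0.0823
print(round(xbar - margin, 3), round(xbar + margin, 3))  # the interval endpoints
```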
Just to check my computations, I put $n, \bar X,$ and $S$ into Minitab, one of the statistical packages that accepts summarized data; its result was consistent with the above within rounding error.