What does the "standard" in "standard deviation" mean?

I know what the standard deviation is. However, I can't make sense of its name. What does the word "standard" refer to?

  • Is it a synonym for mean, so that the "standard deviation" is the "deviation from the mean value"?
  • Is it a methodology, so that the "standard deviation" is the "standard way of determining the deviation" from the mean?

Wikipedia says that the term standard deviation was coined by Karl Pearson in 1894. As far as I could see, his cited article Contributions to the Mathematical Theory of Evolution does indeed introduce the term, but doesn't comment on the chosen name.

There are 2 best solutions below


As noted in the discussion @lulu linked to, Karl Pearson coined the term without explaining his choice. But whether or not he called it the standard deviation for the reason your second bullet point guesses, the name has certainly earned that reading. There are a number of other deviations in statistics, but the standard deviation has several advantages:

  • It has a natural interpretation in terms of a quotient space of variables;
  • It's easy to calculate for many interesting probability distributions, and well-defined for an even broader range of them (by contrast, some alternative measures of deviation require much more of a distribution before they are well-defined);
  • It's the square root of the variance, which is already interesting because it's the second cumulant, and nicely expressed in terms of various generating functions in statistics;
  • Central limit theorems, including the classical one, motivate an interest in Gaussian distributions, which are characterized by their mean and variance as parameters of a linear transformation reducing them to the standard Gaussian distribution;
  • The variance's generalization to multivariate statistics is easily studied with linear algebra;
  • If each observation in a dataset has Gaussian noise of the same variance, maximum likelihood estimation is equivalent to OLS, again easily studied with linear algebra.
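To make the contrast with "other deviations" concrete, here is a minimal sketch (the sample data are illustrative, not from the answer) comparing the standard deviation, which is the square root of the variance, with the mean absolute deviation of the same sample:

```python
# Compare two "deviations" of a small sample using only the standard library.
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = statistics.fmean(data)                   # arithmetic mean: 5.0
variance = statistics.pvariance(data, mu=mean)  # population variance: 4.0
std_dev = math.sqrt(variance)                   # standard deviation = sqrt(variance): 2.0

# Mean absolute deviation: average distance from the mean.
mad = statistics.fmean(abs(x - mean) for x in data)  # 1.5

print(std_dev, mad)  # → 2.0 1.5
```

Both numbers summarize spread around the mean, but only the first is the square root of the second cumulant, which is what makes it algebraically convenient.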

If $x$ is normally distributed with mean $\mu$ and standard deviation $\sigma$, then $\Pr[\mu - \sigma \le x \le \mu + \sigma] \approx 0.68$, $\Pr[\mu - 2\sigma \le x \le \mu + 2\sigma] \approx 0.95$, and $\Pr[\mu - 3\sigma \le x \le \mu + 3\sigma] \approx 0.997$. So $\sigma$ provides a useful rule of thumb for how likely a realization of $x$ is to fall within 1, 2, or 3 standard deviations of its mean. Since $0.05$ and $0.01$ are common significance levels for tests, this is helpful for back-of-the-envelope calculations about confidence intervals.
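This rule of thumb can be checked with nothing beyond the standard library: for $X \sim N(\mu, \sigma^2)$, $\Pr[|X - \mu| \le k\sigma] = \operatorname{erf}(k/\sqrt{2})$. A minimal sketch (the function name is my own):

```python
# Verify the 68-95-99.7 rule for a Gaussian using math.erf.
import math

def within_k_sigma(k: float) -> float:
    """Probability that a normal variate lies within k standard deviations of its mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"Pr[|X - mu| <= {k} sigma] = {within_k_sigma(k):.4f}")
# → Pr[|X - mu| <= 1 sigma] = 0.6827
# → Pr[|X - mu| <= 2 sigma] = 0.9545
# → Pr[|X - mu| <= 3 sigma] = 0.9973
```

Note that the probability depends only on $k$, not on $\mu$ or $\sigma$, which is exactly what makes the rule portable across all normal distributions.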