Theorem stating how much of a population is within $n$ standard deviations from the mean


In a statistics class I took in college, I remember learning about a theorem stating that $50\%$ of the population must be within $1\sigma$ from $\mu$, $75\%$ within $2\sigma$, and so on, regardless of the distribution. The distribution can be tighter, but this much is a guarantee.

People don't seem to be familiar with this, I'm not sure I remembered the numbers right, and I'm not sure this even exists. Is this a thing?


There are 2 answers below.

BEST ANSWER

Yes, part of it is true; it comes from Chebyshev's inequality, which says that $$P(|X-\mu|\le n\sigma)\ge 1-\frac1{n^2}.$$ This gives the mentioned $0.75$ for $n=2$, but the $0.5$ actually appears when you take $n=\sqrt2\approx 1.41$. For $n=1$ it gives only the trivial bound $\ge 0$.

This holds for any distribution with a well-defined finite mean and a finite variance (hence a finite standard deviation). For a specific distribution the exact coverage can be computed, and it is necessarily at least as large as (and, for most common distributions, considerably larger than) the bound given by this theorem.
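As a quick sanity check of the inequality, here is a minimal sketch (not from the answer) that samples an exponential distribution with rate $1$, whose mean and standard deviation are both exactly $1$, and verifies that the empirical coverage within $n\sigma$ beats the Chebyshev bound $1-1/n^2$:

```python
import math
import random

# Empirical check of Chebyshev's bound P(|X - mu| <= n*sigma) >= 1 - 1/n^2
# using an exponential(1) distribution, which has mu = sigma = 1 exactly.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu, sigma = 1.0, 1.0  # exact moments of exponential(1)

for n in (math.sqrt(2), 2.0):
    bound = 1 - 1 / n**2
    frac = sum(abs(x - mu) <= n * sigma for x in samples) / len(samples)
    print(f"n = {n:.3f}: empirical coverage {frac:.4f} >= bound {bound:.4f}")
```

For this distribution the empirical coverage (about $0.91$ at $n=\sqrt2$ and $0.95$ at $n=2$) is well above the guaranteed $0.5$ and $0.75$, illustrating that the bound is a worst case.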

ANSWER

You're thinking of Chebyshev's inequality.

For any $k>0$, in any distribution with a finite mean and standard deviation, the probability of being more than $k$ standard deviations away from the mean is at most $\frac1{k^2}$.

Most distributions, of course, are much tighter than this; the theorem is the worst case.
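To see how loose the worst case is, this short sketch (my addition, not part of the answer) compares the exact normal coverage $2\Phi(k)-1$ with the Chebyshev guarantee $1-1/k^2$ using the standard library's `statistics.NormalDist`:

```python
from statistics import NormalDist

# Exact coverage within k standard deviations for a normal distribution,
# 2*Phi(k) - 1, versus Chebyshev's worst-case guarantee 1 - 1/k^2.
nd = NormalDist()  # standard normal
for k in (1, 2, 3):
    exact = 2 * nd.cdf(k) - 1
    bound = max(0.0, 1 - 1 / k**2)
    print(f"within {k} sigma: normal {exact:.4f}, Chebyshev guarantees {bound:.4f}")
```

For the normal distribution this recovers the familiar 68/95/99.7 rule, against Chebyshev's much weaker 0/75/88.9 percent guarantees.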