Why _doesn't_ the Standard Deviation of a set of observations divide by n-1?


I'm going to resist talking in terms of sample/population. I have a set of n observations and I'm trying to understand why its standard deviation around the mean uses an n divisor rather than n-1.

Say my set is {1, 2, 3}. The mean of this set is 2. It seems to me that there are only 2 degrees of freedom for (x - mean), because the third deviation must equal the negative of the sum of the other two. There are only 2 independent values.
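The degrees-of-freedom intuition can be checked directly: the deviations from the mean always sum to zero, so the last one is determined by the rest. A minimal sketch for the set above:

```python
# Deviations from the mean always sum to zero, so the last deviation
# is determined by the others -- the degrees-of-freedom intuition.
xs = [1, 2, 3]
mean = sum(xs) / len(xs)          # 2.0
devs = [x - mean for x in xs]     # [-1.0, 0.0, 1.0]

# The third deviation equals the negative of the sum of the first two.
print(devs[2] == -(devs[0] + devs[1]))  # True
print(sum(devs))                        # 0.0
```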

Edit: I understand Bessel's correction and why the SD of a sample divides by n-1. I'm asking the opposite: why does the SD of a population divide by n, even though we can "derive" the last deviation from the others?

Without more context, there’s no reason to think about degrees of freedom. The standard deviation is simply the square root of the average squared deviation from the mean. Mathematically, there’s nothing more to say.
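To make the contrast concrete, here is a small sketch (assuming the set {1, 2, 3} from the question) computing both divisors side by side: dividing by n gives the descriptive SD of the set itself, while dividing by n-1 is the Bessel-corrected version used when estimating from a sample.

```python
import math

xs = [1, 2, 3]
n = len(xs)
mean = sum(xs) / n
ss = sum((x - mean) ** 2 for x in xs)   # sum of squared deviations = 2.0

# Divide by n: the SD as a plain descriptive measure of this set.
pop_sd = math.sqrt(ss / n)              # sqrt(2/3) ~ 0.816

# Divide by n-1: Bessel's correction, used when estimating
# a population SD from a sample.
samp_sd = math.sqrt(ss / (n - 1))       # sqrt(1) = 1.0

print(pop_sd, samp_sd)
```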