What is the difference between these formulas for standard deviation?


Throughout my entire math career, I have seen this formula for standard deviation: $$ \sigma = \sqrt{\dfrac{\sum_{i=1}^{N}(X_i-\mu)^2}{N}} $$

However, now that we are covering expected values, means, and standard deviations in my probability theory class, I have learned this formula:

$$ \sigma_x = \sqrt{E(X^2) - \mu_x^2} $$

How can one formula be an average of the squared differences from the actual values, while the other is the expected value of the variable squared minus the squared mean? Are these two formulas equivalent, or have I made a mistake?

1 Answer


It is $$\mu = E(X)$$ and $$ \sigma^2 = \operatorname{Var}(X) = E[(X-\mu)^2] = E(X^2)-\mu^2, $$ $$ \sigma = \sqrt{\operatorname{Var}(X)} = \sqrt{E[(X-\mu)^2]} = \sqrt{E(X^2)-\mu^2}. $$ The last equality follows by expanding the square: $E[(X-\mu)^2] = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - \mu^2$. The expression $$\dfrac{\sum_{i=1}^{N}(X_i-\mu)^2}{N}$$ is an estimator of the variance of $X$: it is the sample mean of the squared deviations $(X_1-\mu)^2,\ldots,(X_N-\mu)^2$ of the observations $X_1,\ldots,X_N$ from the true mean $\mu$. Your expression $$\sqrt{\dfrac{\sum_{i=1}^{N}(X_i-\mu)^2}{N}}$$ is then an estimator of the standard deviation, but it is not the "true" standard deviation.
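For what it's worth, you can sanity-check the equivalence $E[(X-\mu)^2] = E(X^2) - \mu^2$ numerically. A minimal Python sketch, using a fair six-sided die as an (arbitrarily chosen) example distribution:

```python
import math

# A fair six-sided die: outcomes 1..6, each with probability 1/6.
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6

mu = sum(p * x for x in values)  # E[X] = 3.5

# Variance computed two ways:
var_def = sum(p * (x - mu) ** 2 for x in values)     # E[(X - mu)^2]
var_alt = sum(p * x ** 2 for x in values) - mu ** 2  # E[X^2] - mu^2

sigma_def = math.sqrt(var_def)
sigma_alt = math.sqrt(var_alt)

print(var_def, var_alt)      # both equal 35/12
print(sigma_def, sigma_alt)  # both ~1.7078
```

Both expressions give the same variance ($35/12$ for a fair die), so the two standard-deviation formulas agree whenever the exact mean $\mu$ is used.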