Definition of Standard Deviation


We note that given a probability density function $P$ over a space $U$, the expected value of a function of the elements of $U$ is:

$$ E(f(x)) = \int_{U} f(x)P(x)\,dx $$

We thus consider the mean to be the expected value of $x$ itself, that is:

$$ E(x) = \int_{U} x P(x)\,dx $$

Now we would consider the "standard deviation" to be the expected absolute deviation of a variable from the mean, that is

$$ Std(x) = E(|x - E(x)|) = E\left(\sqrt{(x - E(x))^2}\right) $$

Yet the standard deviation is always defined as:

$$ \sqrt{E\left((x - E(x))^2\right)} $$

The latter formula doesn't make sense to me. Can someone explain why mine is wrong and the latter is correct?
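To make the discrepancy concrete, here is a small numeric sketch on an arbitrary discrete distribution (the values and probabilities are made up for illustration). It computes my quantity, $E(|x - E(x)|)$ (the mean absolute deviation), alongside the textbook $\sqrt{E((x - E(x))^2)}$, and the two generally differ:

```python
import math

# A made-up discrete distribution: four values, each with probability 1/4.
values = [0.0, 1.0, 2.0, 9.0]
probs = [0.25, 0.25, 0.25, 0.25]

# Mean: E(x) = sum of x * P(x)
mean = sum(x * p for x, p in zip(values, probs))

# My quantity: mean absolute deviation, E(|x - E(x)|)
mad = sum(abs(x - mean) * p for x, p in zip(values, probs))

# The textbook standard deviation: sqrt(E((x - E(x))^2))
std = math.sqrt(sum((x - mean) ** 2 * p for x, p in zip(values, probs)))

print(mean, mad, std)  # the square-root-of-mean-square is larger here
```

For this distribution the mean is $3$, the mean absolute deviation is $3$, and the standard deviation is $\sqrt{12.5} \approx 3.54$, so swapping the order of the square root and the expectation really does change the answer.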