If I'm not mistaken, the standard deviation is defined as $\sigma = \sqrt{\frac{1}{N} \sum_{i=1}^N (x_i - \mu)^2}$ .
You could rewrite this as $\sigma = (\frac{1}{N} \sum_{i=1}^N |x_i - \mu|^c)^\frac{1}{c}$ where $c=2$.
This makes me wonder: what is special about $c=2$?
Why don't we ever use any other value for $c$?
I'm probably mistaken, but to me it seems that the only effect of a different $c$ would be that the average distance from the mean is weighted more or less heavily (depending on whether you choose a larger or smaller $c$).
Or is $c=2$ just a convention because $\sqrt{x}$ is easier to calculate than (just a random example) $x^\frac{1}{8.15}$?
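To illustrate what I mean, here's a quick sketch of the generalized formula for a few values of $c$ (the function name `generalized_deviation` is just my own; $c=2$ should match the usual population standard deviation):

```python
import numpy as np

def generalized_deviation(x, c):
    """Compute (1/N * sum |x_i - mu|^c) ** (1/c).

    For c=2 this is the population standard deviation;
    for c=1 it is the mean absolute deviation.
    """
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean(np.abs(x - mu) ** c) ** (1.0 / c)

x = [1.0, 2.0, 3.0, 4.0, 5.0]

# c=2 reproduces np.std (population convention, ddof=0)
print(generalized_deviation(x, 2), np.std(x))

# Larger c makes points far from the mean count more heavily,
# so the value grows with c for this sample:
for c in (1, 2, 4, 8):
    print(c, generalized_deviation(x, c))
```

At least on this example, a larger $c$ gives more weight to the outlying points, which is the "more or less visible" effect I was guessing at.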