Computing std dev given rms and mean


Say I have the rms of a dataset with values $x_i$, which was calculated as

$$ rms = \sqrt{\frac{1}{N}\sum x_i^2} $$

and a mean that was calculated as

$$ \mu = \frac{\sum x_i}{N} $$

Given these two pieces, I'd like to compute the standard deviation $\sigma$ (without accessing the original data $x_i$). This seems simple:

$$ \begin{align} \sigma^2 &= \frac{1}{N}\sum(x_i-\mu)^2 \\ &= \frac{1}{N}\sum(x_i^2 - 2x_i\mu + \mu^2) \\ &= \frac{1}{N}\left(\sum x_i^2 - 2\mu\sum x_i + N\mu^2 \right) \\ &= \frac{1}{N}\left(N(rms)^2 - 2N\mu^2 + N\mu^2\right) \\ &= (rms)^2 - \mu^2 \\ &\implies \sigma = \sqrt{(rms)^2 - \mu^2} \end{align} $$

I think this seems correct, and it agrees with the form shown here. However, I found a discussion on StackOverflow where one user suggests that it should be as simple as subtracting $\mu$ from the rms to recover $\sigma$, and another says that no, you need access to the individual values to be able to do this. Both of them seem to be wrong, given my calculation.

Unless what I've done rests on some hidden assumptions. Will this only work if the data is unweighted? (It is in my case.) Are there any other implicit assumptions I'm making?
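As a sanity check, the identity $\sigma = \sqrt{(rms)^2 - \mu^2}$ can be verified numerically. This is a minimal sketch with made-up data (the dataset and its parameters are hypothetical), comparing the recovered $\sigma$ against the population standard deviation computed directly from the raw values:

```python
import math
import random

# Hypothetical dataset; any numeric data works here.
data = [random.gauss(5.0, 2.0) for _ in range(1000)]
N = len(data)

# The two summary statistics assumed available (population forms, 1/N).
rms = math.sqrt(sum(x * x for x in data) / N)
mu = sum(data) / N

# Standard deviation recovered from rms and mean alone.
sigma_recovered = math.sqrt(rms**2 - mu**2)

# Population standard deviation computed directly from the raw data.
sigma_direct = math.sqrt(sum((x - mu) ** 2 for x in data) / N)

print(sigma_recovered, sigma_direct)  # the two values agree
```

One numerical caveat: computing $(rms)^2 - \mu^2$ subtracts two nearly equal quantities when $\mu$ is large relative to $\sigma$, so in floating point the recovered value can lose precision even though the algebra is exact.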

1 Answer

This works because of the identity $$\text{Var}(X) = \mathbb{E}[X^2] - \mathbb{E}[X]^2.$$ The rms gives you $\mathbb{E}[X^2] = (rms)^2$, and the mean gives you $\mathbb{E}[X]^2 = \mu^2$. The standard deviation is then just the square root of the variance.
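One implicit assumption worth making explicit: the derivation recovers the *population* standard deviation (the $1/N$ form used in the question), not the sample standard deviation with Bessel's correction ($1/(N-1)$). A short sketch using NumPy (the data here is hypothetical) showing that the recovered value matches `np.std` with its default `ddof=0`:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(10.0, 3.0, size=10_000)  # hypothetical data

rms = np.sqrt(np.mean(x**2))  # sqrt of E[X^2]
mu = x.mean()                 # E[X]

sigma = np.sqrt(rms**2 - mu**2)

# np.std defaults to ddof=0, i.e. the population (1/N) standard
# deviation, which matches the question's definition of sigma.
print(np.isclose(sigma, x.std()))  # True
```

To recover the sample standard deviation instead, one would rescale by $\sqrt{N/(N-1)}$, which still requires knowing $N$ but not the individual values.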