Deriving variance from the expected deviation from the mean of a normal distribution


I know the expected absolute deviation from the mean of a normal distribution $E[|X-\mu_x|]$. From this I want to derive the variance $\sigma^2$ of said distribution. This is done to tune a filter of $X$ that expects an estimate of the current variance at each time step. I don't know the sign of the deviation, just the magnitude.

I identified two approaches to derive the variance, but I am unsure which one is the better estimate of the variance:

Approach 1 - Re-arranging the mean absolute deviation (MAD):

Here I interpret $E[|X-\mu_x|]$ as the known expected deviation from the mean. For a normal distribution: $$E[|X-\mu_x|] = \sigma_x \sqrt{\frac{2}{\pi}}$$
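For completeness, this identity follows directly from the half-normal integral (a standard derivation, sketched here for a zero-mean deviation $Y = X-\mu_x \sim \mathcal{N}(0,\sigma_x^2)$):

$$E[|Y|] = 2\int_0^\infty \frac{y}{\sigma_x\sqrt{2\pi}}\, e^{-y^2/(2\sigma_x^2)}\, dy = \frac{2\sigma_x}{\sqrt{2\pi}} = \sigma_x\sqrt{\frac{2}{\pi}}$$

where the integral is evaluated with the substitution $u = y^2/(2\sigma_x^2)$.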

The MAD can be re-arranged to yield the variance:

$${\sigma_x}^2 = \frac{\pi}{2} \cdot {E[|X-\mu_x|]}^2$$
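As a quick sanity check of this rearrangement, the following Monte Carlo sketch (with an arbitrarily chosen $\mu_x = 1$, $\sigma_x = 2.5$) estimates the MAD from samples and recovers the variance via the $\frac{\pi}{2}$ factor:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.5                    # assumed true parameters for the check
x = rng.normal(mu, sigma, size=1_000_000)

mad = np.mean(np.abs(x - mu))           # sample estimate of E[|X - mu_x|]
var_from_mad = (np.pi / 2) * mad**2     # Approach 1: (pi/2) * MAD^2

print(var_from_mad)                     # should be close to sigma^2 = 6.25
```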

Approach 2 - Calculating ${\sigma_x}^2$ directly from one sample

$${\sigma_x}^2 = \frac{1}{n} \sum_{i=1}^n (X_i-\mu_x)^2$$

with $n=1$:

$${\sigma_x}^2 = (X_1 -\mu_x)^2$$
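A short simulation (same assumed $\mu_x = 1$, $\sigma_x = 2.5$ as above) illustrates what this single-sample estimator does: it is unbiased in expectation, but any individual squared deviation scatters very widely around the true variance:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 1.0, 2.5                    # assumed true parameters for the check
x = rng.normal(mu, sigma, size=1_000_000)

sq_dev = (x - mu)**2                    # Approach 2 applied to each single sample
print(sq_dev.mean())                    # averages to sigma^2 = 6.25 (unbiased)
print(sq_dev.std())                     # but a single draw is a very noisy estimate
```

The spread of a single squared deviation is $\sigma_x^2\sqrt{2} \approx 8.84$ here, larger than the quantity being estimated.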

Which approach is valid? The argument for approach 1 seems straightforward. In approach $2$ I can't even apply Bessel's correction without dividing by zero (and I don't know the mean, just the deviation from it!).

Edit:

To clarify my use case: I know the deviation, which is derived from a separate measurement. I trust this measurement to be the expected deviation (mean absolute deviation). I do not know the sign of the deviation, so I cannot directly correct $X$. Instead, I am filtering it with a variable estimate of the variance.