Given the mean and the range of a normal distribution, find the variance


I am wondering whether it is possible to estimate the variance from the mean and the range (max, min) of a normal distribution. I only need an approximate result.


The two parameters that specify a normal distribution are its mean and variance. Strictly speaking, there is no such thing as its "range" — or rather, there is: it is negative infinity to positive infinity, because the density is positive at every real number. Every normal distribution has infinite range, so the question in its original form doesn't quite make sense.
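To see this concretely, here is a minimal sketch (using only the standard library) of the normal density, which is strictly positive however far `x` is from the mean — it only decays toward zero:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of N(mu, sigma^2): positive for every real x.
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

for x in (0, 2, 5, 10):
    # The value shrinks rapidly but never reaches zero (until
    # floating-point underflow, far out in the tails).
    print(x, normal_pdf(x))
```

So any interval (max, min) you observe covers only part of the distribution's support.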


If you have samples from the distribution, their range is still a fairly poor indicator of the variance — you need the actual values. With a set of samples, you can compute the variance using the standard sample-variance formula.
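As a minimal sketch, the standard (unbiased) sample-variance formula, plus two hypothetical datasets that share the same range yet have very different variances — which is why the range alone is a poor estimator:

```python
def sample_variance(xs):
    # Unbiased sample variance: sum((x - mean)^2) / (n - 1).
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / (n - 1)

# Both datasets have min 0 and max 10, i.e. the same range of 10,
# but the mass is spread very differently around the mean.
a = [0, 5, 5, 5, 5, 10]
b = [0, 0, 0, 10, 10, 10]

print(sample_variance(a))  # 10.0
print(sample_variance(b))  # 30.0
```

The second dataset's variance is three times the first's, even though the mean (5) and the range (0 to 10) are identical.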