History of Mathematical Formulas


I just wondered why, when calculating something such as the variance, we square the difference between a value and its arithmetic mean rather than take the absolute value of the difference. Are there books or resources that describe why a specific formula ended up like this?

Variance here is just an example; if possible, I would like more general resources.


Mathematician gave one example. There are a few other reasons I can think of:

  1. The moments of a distribution are defined in terms of powers of the random quantity. You have your first moment ($E[X]$) for the "location", your second ($E[X^2]$), etc. The only reason we "center" the variable in the variance formula is to isolate the spread about its location. $E[|X|]$ is not a moment of the distribution.
  2. Analytically, $E[|X|]$ is not nice to work with: the absolute value is not differentiable at zero, so its derivative is discontinuous.
  3. There are deep connections between the variance and concepts such as information, precision (think of the CLT), entropy, etc. So it is a rather "well connected" measure of variability.
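
One concrete way to see why the squared version is analytically nicer (my illustration, not part of the answer above): for independent random variables, variances add exactly, while mean absolute deviations do not. A minimal sketch using two fair dice as a toy distribution:

```python
# Compare variance and mean absolute deviation (MAD) on a toy example:
# two independent fair dice. Variance is additive for independent variables;
# MAD is not. (Hypothetical example data, not from the original answer.)
from itertools import product

X = [1, 2, 3, 4, 5, 6]  # outcomes of die 1, all equally likely
Y = [1, 2, 3, 4, 5, 6]  # outcomes of die 2

def mean(vs):
    return sum(vs) / len(vs)

def variance(vs):
    m = mean(vs)
    return mean([(v - m) ** 2 for v in vs])

def mad(vs):
    m = mean(vs)
    return mean([abs(v - m) for v in vs])

# Distribution of the sum X + Y over all 36 equally likely outcomes.
S = [x + y for x, y in product(X, Y)]

print(variance(S), variance(X) + variance(Y))  # equal: Var(X+Y) = Var(X) + Var(Y)
print(mad(S), mad(X) + mad(Y))                 # not equal: no such identity for MAD
```

Here `variance(S)` matches `variance(X) + variance(Y)` exactly (35/6), while `mad(S)` is about 1.94 versus 3.0 for the sum of the individual MADs.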