When are geometric and harmonic means used?


From what little statistics I know, the only 'mean' commonly used is the arithmetic mean, and the rest are irrelevant. Any reading I've done has pretty much said something along the lines of "acceleration or something".

So, in what situations are the geometric, harmonic, and other types of means genuinely useful, and why are they used in those situations?



From the Wikipedia article Root mean square come the quotes

In statistics and its applications, the root mean square (abbreviated RMS or rms) is defined as the square root of the mean square (the arithmetic mean of the squares of a set of numbers).

and

In estimation theory, the root mean square error of an estimator is a measure of the imperfection of the fit of the estimator to the data.

and

Physical scientists often use the term "root mean square" as a synonym for standard deviation when it can be assumed the input signal has zero mean

In the case of the geometric mean, if you have some cubic containers and find the geometric mean of their volumes, then that is the cube of the geometric mean of their side lengths. The Wikipedia article Geometric mean mentions proportional growth and other examples.
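A quick numeric check of that property, with hypothetical side lengths chosen just for illustration:

```python
from math import prod

def geometric_mean(xs):
    """Geometric mean: the n-th root of the product of n positive numbers."""
    return prod(xs) ** (1 / len(xs))

# Hypothetical side lengths of some cubic containers.
sides = [1.0, 2.0, 4.0]
volumes = [s ** 3 for s in sides]

# The geometric mean of the volumes equals the cube of the
# geometric mean of the side lengths (the arithmetic mean has
# no analogous property).
print(geometric_mean(volumes))      # ≈ 8.0
print(geometric_mean(sides) ** 3)   # ≈ 8.0
```

This works because the geometric mean turns products into products: cubing every input cubes the mean.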

In the case of the harmonic mean, if you have some cars with data on their fuel efficiency, distance per amount of fuel (MPG), then a better average measure of their fuel efficiency is the harmonic mean, because what matters is the amount of fuel needed to travel a fixed distance. This is mentioned in the Wikipedia article on Harmonic mean.
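A sketch of why the harmonic mean is the right average here, using made-up MPG figures for three cars that each drive the same fixed distance:

```python
def harmonic_mean(xs):
    """Harmonic mean: reciprocal of the arithmetic mean of the reciprocals."""
    return len(xs) / sum(1 / x for x in xs)

# Hypothetical fuel efficiencies (miles per gallon).
mpg = [20.0, 30.0, 60.0]
distance = 60.0  # miles driven by each car

# Fuel actually consumed over the fixed distance: 3 + 2 + 1 = 6 gallons,
# so the fleet's true overall efficiency is 180 miles / 6 gallons = 30 MPG.
fuel_used = sum(distance / m for m in mpg)
true_average = len(mpg) * distance / fuel_used

print(harmonic_mean(mpg))     # ≈ 30.0, matches the true overall MPG
print(sum(mpg) / len(mpg))    # ≈ 36.7, the arithmetic mean overstates it
```

The arithmetic mean would be correct instead if each car burned the same amount of fuel rather than covering the same distance.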

A classic example of the use of harmonic and geometric means was when Archimedes bounded the value of $\, \pi \,$ by finding the perimeters of inscribed and circumscribed regular polygons of a circle of diameter one. When the number of sides of the polygons is doubled, the new perimeters are harmonic and geometric means of the previous perimeters.
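That doubling step can be sketched numerically. For a circle of diameter one, the inscribed and circumscribed regular hexagons have perimeters $3$ and $2\sqrt 3$; each doubling takes a harmonic mean and then a geometric mean:

```python
from math import sqrt, pi

# Perimeters of the inscribed and circumscribed hexagons
# around a circle of diameter 1.
inscribed, circumscribed = 3.0, 2 * sqrt(3)

for _ in range(4):  # 6 -> 12 -> 24 -> 48 -> 96 sides
    # New circumscribed perimeter: harmonic mean of the previous pair.
    circumscribed = 2 * inscribed * circumscribed / (inscribed + circumscribed)
    # New inscribed perimeter: geometric mean of the previous inscribed
    # perimeter and the new circumscribed one.
    inscribed = sqrt(inscribed * circumscribed)

# After four doublings (96-gons) the perimeters bracket pi between
# roughly 3.1410 and 3.1427, matching Archimedes' bounds.
print(inscribed, pi, circumscribed)
```

Each pass tightens the bracket, and the true circumference $\pi$ always stays between the two perimeters.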


The root-mean square of $x_1,\ldots, x_n$ is $\sqrt{\dfrac{x_1^2+\cdots+x_n^2} n \,}.$
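Translated directly into code, the formula is a sketch like this:

```python
from math import sqrt

def root_mean_square(xs):
    """Square root of the arithmetic mean of the squares."""
    return sqrt(sum(x * x for x in xs) / len(xs))

print(root_mean_square([3.0, 4.0]))   # sqrt((9 + 16)/2) = sqrt(12.5) ≈ 3.54
```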

$\newcommand{\v}{\operatorname{var}}$The standard deviation is the root-mean-square deviation. That's one example of a sort of mean other than the arithmetic mean being used. One can wonder why one uses the S.D. as a measure of dispersion rather than the arithmetic mean of the absolute deviations. The reason is that $\v(X_1+\cdots+X_n)$ $= \v(X_1)+\cdots+\v(X_n)$ if $X_1,\ldots,X_n$ are independent random variables, and you need that every time you use the central limit theorem.
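The additivity of variance can be checked exactly for a small example. Here is a sketch with two independent fair dice, enumerating the distribution of their sum:

```python
from itertools import product

def variance(dist):
    """Variance of a finite distribution given as {value: probability}."""
    mean = sum(v * p for v, p in dist.items())
    return sum(p * (v - mean) ** 2 for v, p in dist.items())

# One fair six-sided die.
die = {v: 1 / 6 for v in range(1, 7)}

# Distribution of the sum of two independent dice, built by
# exhaustive enumeration (independence lets us multiply probabilities).
total = {}
for (a, pa), (b, pb) in product(die.items(), die.items()):
    total[a + b] = total.get(a + b, 0) + pa * pb

# var(X + Y) = var(X) + var(Y) for independent X and Y:
print(variance(die))     # 35/12 ≈ 2.917
print(variance(total))   # 35/6  ≈ 5.833, exactly double
```

The mean absolute deviation has no comparable additivity, which is the point being made above.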

Another place where root-mean-square is used is in electrical engineering. When the voltage alternates according to a sine wave, the nominal voltage is the root-mean-square voltage. That way, the formula $$\text{volts}\times\text{amps} = \text{watts}$$ that applies to direct current also applies to alternating current.
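A numeric sketch of both facts, assuming a sine wave with a 170 V peak (roughly the US mains peak, whose nominal 120 V is the RMS value) and a purely resistive load:

```python
from math import sin, sqrt, pi

# Sample one full cycle of a sine-wave voltage.
peak = 170.0
n = 100_000
volts = [peak * sin(2 * pi * k / n) for k in range(n)]

# The RMS of a sine wave is its peak divided by sqrt(2).
rms = sqrt(sum(v * v for v in volts) / n)
print(rms, peak / sqrt(2))   # the two agree, ≈ 120.2 V

# For a resistive load, the time-averaged power V(t) * I(t)
# equals V_rms * I_rms, just as volts * amps = watts for DC.
ohms = 10.0
avg_power = sum(v * (v / ohms) for v in volts) / n
print(avg_power, rms * (rms / ohms))   # the two agree
```

The same average-power argument is why RMS, rather than the peak or the mean of the absolute voltage, is the quantity printed on the nameplate.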