The arithmetic mean of two numbers makes sense: it is just the "middle" of the two. The arithmetic mean of a set of data points $a_i$ also makes sense (more or less) and is called the average (or sometimes the mean). What about the geometric mean?
I tried to make sense of it but could not find anything convincing. I convinced myself in the case of the geometric mean of two numbers as follows:
It is the number which, when inserted between the two numbers, forms a geometric progression with them (and geometric progressions are quite useful).
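To make this concrete, here is a small sketch (the numbers $3$ and $12$ are just an example I picked): the inserted number $g$ must satisfy $g/a = b/g$, i.e. $g = \sqrt{ab}$.

```python
import math

def insert_geometric_mean(a, b):
    # The number g making a, g, b a geometric progression satisfies
    # g / a == b / g, hence g = sqrt(a * b).
    return math.sqrt(a * b)

g = insert_geometric_mean(3, 12)   # 6.0, so 3, 6, 12 has common ratio 2
assert g / 3 == 12 / g             # same ratio on both sides
```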
But what about the geometric mean of a set of points $a_i$? What does the number $(a_1 a_2 \cdots a_n)^{1/n}$ mean for the set of numbers $a_i$?
One observation is that the geometric mean is the same as the arithmetic mean if we take the logarithms of the values.
$$\log\bigl(\mathrm{GM}(X_i)\bigr) = \mathrm{AM}\bigl(\log(X_i)\bigr)$$
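This identity is easy to check numerically; here is a minimal sketch in Python (the sample values are arbitrary):

```python
import math

def geometric_mean(xs):
    # Compute the geometric mean via the log identity:
    # GM(x_i) = exp(AM(log x_i)), valid for positive values.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

xs = [2.0, 8.0, 4.0]
gm = geometric_mean(xs)

# log(GM) equals the arithmetic mean of the logs
am_of_logs = sum(math.log(x) for x in xs) / len(xs)
assert abs(math.log(gm) - am_of_logs) < 1e-12
```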
The geometric mean is more natural/appropriate when it is more natural/appropriate to multiply the values (hence, sum the logarithms) rather than to sum them. (Also, the values should be inherently positive.) (See for example here.)
For example, if the inflation rate in some year was $R_i$, that means prices increased by a multiplicative factor $F_i = 1+R_i$. The accumulated effect of several years is obtained by multiplying these factors (which are inherently positive, even under deflation); it makes total sense to multiply them, and little or no sense to sum them. To compute an average increase over a sequence of $n$ years, one should therefore compute $\overline F=\left(\prod F_i\right)^{1/n}$ (and from that the "average inflation rate" $\overline R=\overline F -1$). This average has a concrete meaning: if every year had the same constant price factor $\overline F$, the total increase would have been the same.
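The inflation computation above can be sketched as follows (the yearly rates are made-up example data):

```python
# Hypothetical yearly inflation rates R_i; factors F_i = 1 + R_i.
rates = [0.03, 0.05, -0.01]          # made-up example, including one deflation year
factors = [1 + r for r in rates]

n = len(factors)
total = 1.0
for f in factors:                    # accumulated effect: multiply the factors
    total *= f

avg_factor = total ** (1 / n)        # geometric mean of the factors
avg_rate = avg_factor - 1            # the "average inflation rate"

# Applying the constant average factor for n years reproduces the total increase.
assert abs(avg_factor ** n - total) < 1e-12
```

Note that averaging the rates arithmetically would slightly overstate the result, since it ignores compounding.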