I occasionally come across an article or study that reports an average in the form of a range (e.g., 8.1 - 8.7). For example, literature covering shoe sizes might state the average shoe size for adult males to be 9-12.
But if I take the mean of {7.7, 7.8, 8.5, 10.2, 11.5}, I naturally get 9.14, not some range of numbers.
Is the use of a range ever valid for an average, or is this incorrect and an average should always be one number?
I believe part of the problem is that the word "average" is used for many things: it typically refers to the arithmetic mean, but it can also refer to the mode, median, geometric mean, or harmonic mean. All of these notions of average try to capture some idea of "central tendency" or "most likely value". A range could be considered part of this tradition.
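To make this concrete, here is a short sketch using the sample from the question and Python's standard `statistics` module, showing that the different notions of "average" give different single numbers for the same data (this example is mine, not from the original article):

```python
from statistics import mean, median, geometric_mean, harmonic_mean

# Sample of shoe sizes from the question (illustrative only)
sizes = [7.7, 7.8, 8.5, 10.2, 11.5]

print(f"arithmetic mean: {mean(sizes):.2f}")    # the usual "average"
print(f"median:          {median(sizes):.2f}")  # middle value when sorted
print(f"geometric mean:  {geometric_mean(sizes):.2f}")
print(f"harmonic mean:   {harmonic_mean(sizes):.2f}")
```

Each of these is a single number, which is why a reported range is unlikely to be any of them directly.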
However, I suspect that the range was derived from a sample and is reporting some type of statistical interval without being precise about it. Let's use your example of shoe sizes for adult males. There is no way for someone to have computed the arithmetic mean of all adult men's shoe sizes (the population mean). So instead, they probably computed the mean of a random sample of adult men's shoe sizes. This estimate (i.e., the sample mean) depends on the sample, which is hopefully representative of the entire population. Statisticians have derived intervals (ranges) that communicate the uncertainty in the estimation procedure. There is some fine print that goes along with these intervals, but like a lot of fine print, it doesn't always make it into the final version of the communication.
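As a sketch of what such an interval might look like, here is a rough 95% confidence interval for the mean, computed from the sample in the question using a normal approximation (the critical value 1.96). This is my illustration, not the procedure used by whatever article reported the range, and with only five observations a t-based interval would be wider:

```python
import math
import statistics

# Sample of shoe sizes from the question (illustrative only)
sizes = [7.7, 7.8, 8.5, 10.2, 11.5]

n = len(sizes)
sample_mean = statistics.mean(sizes)               # point estimate of the population mean
sem = statistics.stdev(sizes) / math.sqrt(n)       # standard error of the mean

# Approximate 95% confidence interval (normal critical value 1.96);
# a t-interval would be more appropriate for such a small sample.
lower = sample_mean - 1.96 * sem
upper = sample_mean + 1.96 * sem
print(f"mean = {sample_mean:.2f}, 95% CI ≈ ({lower:.2f}, {upper:.2f})")
```

A reader who only sees the interval, without the fine print about how it was constructed, might easily describe it as "the average is 7.7-10.6".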