If I have a function that calculates the mean of an arbitrary subset of some set $X$, does the mean, by definition, have to also be an element of $X$? (In other words, if the mean can't be in $X$, does that imply that "mean" cannot be defined for $X$?) If not, is there a counterexample?
Example: If I can calculate the mean of a set of floobles, does the mean value also need to be a flooble for me to be able to claim that I can actually calculate the mean?
Edit - Possibly More Concrete Example: If I claimed I could calculate the average value of a set of words, but I defined that to mean "the average length of a word in the set", could you then argue that, since the average value is not itself a word, it is not actually an "average" value of those words?
Sorry if that was confusing, I am having trouble expressing this in words.
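To try expressing it in code instead, here is a rough Python sketch (`average_value` is just a name I made up for this question):

```python
def average_value(words):
    """My proposed "average" of a collection of words: the mean word length."""
    return sum(len(w) for w in words) / len(words)

print(average_value(["cat", "horse", "ox"]))  # 3.33..., a number, not a word
```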
There is no universal definition of "average". In most contexts where averages are mentioned, you are working with at least the real numbers. There is even a sort of average taken in group theory:
Burnside's lemma
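For concreteness, Burnside's lemma counts the orbits of a finite group $G$ acting on a set $X$ as the average, over the group, of the number of fixed points:

$$|X/G| = \frac{1}{|G|} \sum_{g \in G} |X^g|, \qquad X^g = \{x \in X : g \cdot x = x\}.$$

Note the shape: a sum over $G$, then a normalization by $|G|$. The result is a nonnegative integer, not an element of $X$ or of $G$.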
So when something is labeled with the term "average" there is usually a "summation" operator, though this may be some non-everyday operator (set union, for instance), and a final normalizing factor that you scale the sum by.
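To make that pattern concrete, here is a minimal sketch in Python; the names `generalized_mean`, `combine`, and `normalize` are my own inventions for illustration, not any standard API:

```python
from functools import reduce
from operator import add

def generalized_mean(elements, combine, normalize):
    """Fold the elements together with `combine`, then rescale with `normalize`."""
    total = reduce(combine, elements)
    return normalize(total, len(elements))

# Ordinary arithmetic mean: the "summation" is +, normalization divides by the count.
print(generalized_mean([1, 2, 3, 4], add, lambda total, n: total / n))  # 2.5

# A non-everyday "summation": set union. Here the normalizer is trivial,
# so this "average" of a family of sets is just their union.
family = [{1, 2}, {2, 3}, {3, 4}]
print(generalized_mean(family, lambda a, b: a | b, lambda total, n: total))  # {1, 2, 3, 4}
```

In the first call the result (2.5) is not an element of the input set; in the second, the result happens to be a set like the inputs, though not one of them. That is exactly the distinction the question is asking about.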