I'm writing some code that calculates averages. Obviously, the traditional way to calculate an average is to add up all the values and then divide by the number of values.
However, in the mechanism I'm working on, it is much easier to fold the values in one at a time: add each new value to the running "average" and then divide by two (since two numbers are being combined each time). But I'm not sure how accurate that would be.
Can I calculate averages this way? Or is it not reliable?
NOTE: While originally writing this question, I came up with an example and found my answer, so I'm posting the question and the answer together, Q&A style.
Imagine this set of numbers: 1, 2, 12.

The traditional method is:

(1 + 2 + 12) / 3 = 15 / 3 = 5

What you are proposing is:

((1 + 2) / 2 + 12) / 2 = (1.5 + 12) / 2 = 6.75

The two results disagree (5 vs. 6.75), so no, this proposed method of calculating an average does not work.
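To make the comparison concrete, here is a short sketch of both calculations (the function names are mine, chosen for illustration):

```python
def traditional_average(values):
    """Sum everything, then divide by the count."""
    return sum(values) / len(values)

def halving_average(values):
    """The proposed method: fold each new value in, dividing by two each time."""
    running = values[0]
    for value in values[1:]:
        running = (running + value) / 2
    return running

values = [1, 2, 12]
print(traditional_average(values))  # 5.0
print(halving_average(values))      # 6.75
```

The two functions visibly diverge on the same input, which is the whole point of the example.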
Not to mention, even if it did work, it would save you nothing: either way you perform one addition and one division per value. The real problem is that the repeated halving weights the values unevenly: the last value added counts for half the result, the one before it for a quarter, and so on.
What you could do instead is continue adding the values together and, alongside that, keep track of how many values you've added. Each time you add a value, also increment the count. Then, at any given point, you can calculate the average by dividing the running sum by the count.
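A minimal sketch of that sum-plus-count approach (the class and method names here are mine, not from the question):

```python
class RunningAverage:
    """Keep a running sum and a count; the average is available at any time."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, value):
        self.total += value
        self.count += 1

    def average(self):
        return self.total / self.count

avg = RunningAverage()
for value in [1, 2, 12]:
    avg.add(value)
print(avg.average())  # 5.0
```

This still lets you feed values in one at a time, which was the original goal, without distorting the result.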
In any case, the moral of the story is that you still need to add all the values together, and then divide by the number of values. You can keep a sum of all values, but you also need to keep a count of the values which have been added to that sum.
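If you would rather not carry a growing sum around, there is a standard incremental form of the same calculation (the running-mean update used in Welford's algorithm) that updates the mean directly; it is algebraically equivalent to sum/count:

```python
def running_mean(values):
    """Incremental mean: mean += (value - mean) / n.
    Equivalent to sum(values) / len(values), but the running
    value never grows beyond the range of the data."""
    mean = 0.0
    for n, value in enumerate(values, start=1):
        mean += (value - mean) / n
    return mean

print(running_mean([1, 2, 12]))  # 5.0
```

This variant is handy when the values are large or numerous enough that an accumulated sum could lose floating-point precision.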