Calculate a new average from the previous average


I hope the title makes a bit of sense, as I don't know how to phrase it.

Basically, I'm writing a script which checks the response time of a server in milliseconds.

But instead of having thousands of values, I'd like to just stick to a single average value.

What math would I need to use for using an average value to calculate a new average value?

I.e.:

$$(9 + 4 + 7) / 3 = 6.666667 \quad \text{(correct average)}$$

$$(9 + 4) / 2 = 6.5, \qquad (6.5 + 7) / 2 = 6.75 \quad \text{(incorrect average)}$$

What would I need to change in that 2nd calculation in order to keep getting the correct average?

Best answer:

You can't do this with the average alone; you need one more piece of state. You'll need to know $N$, the number of data points so far. Then the update is essentially what you attempted, but weighted by $N$:

$$Avg_{new} = \frac{(Avg_{old}*N + Data_{new})}{N+1}$$

And be sure to update $N = N+1$ afterwards.
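The formula above can be sketched as a small helper function. This is a minimal illustration for a response-time script; the function and variable names are my own, not from the question:

```python
def update_average(avg_old, n, data_new):
    """Fold a new sample into the average of n existing samples.

    Weighting the old average by n recovers the old sum, so the
    result equals the true average over all n + 1 samples.
    Returns the new average and the updated count.
    """
    avg_new = (avg_old * n + data_new) / (n + 1)
    return avg_new, n + 1

# Numbers from the question: 9, 4, then 7.
avg, n = 9.0, 1                      # first sample
avg, n = update_average(avg, n, 4)   # avg = 6.5, n = 2
avg, n = update_average(avg, n, 7)   # avg = 20/3 = 6.666..., n = 3
```

Because only `avg` and `n` are kept between updates, the script never needs to store the thousands of individual response times.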