Say I have a stream of values arriving all the time, and I want to get the average and standard deviation of only the last $n$ values.
If I already have the average $V$ for values $v_1, ..., v_n$, then when $v_{n+1}$ arrives, all I have to do to find the new average $V'$ for values $v_2, ..., v_{n+1}$ is:
$$ V' = V - \frac{v_1}{n} + \frac{v_{n+1}}{n} $$
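This update rule can be sketched as a small rolling-mean helper (an illustrative sketch; the class and method names are my own, and it buffers the window in a deque so the oldest value $v_1$ is available to subtract):

```python
from collections import deque

class SlidingMean:
    """Mean of the last n values, updated in O(1) per new value."""
    def __init__(self, n):
        self.n = n
        self.window = deque()  # holds the current values so the oldest can be removed
        self.total = 0.0       # running sum of the window

    def push(self, v):
        self.window.append(v)
        self.total += v
        if len(self.window) > self.n:
            # window is full: drop v_1, matching V' = V - v_1/n + v_{n+1}/n
            self.total -= self.window.popleft()

    def mean(self):
        return self.total / len(self.window)
```

For example, with $n = 3$ and the stream $1, 2, 3, 4$, the window ends up as $\{2, 3, 4\}$ and `mean()` returns $3$.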
However, how can I do this for the standard deviation? Is there any such shortcut?
EDIT:
The purpose of this question is to minimize the computation necessary for software I'm developing. Part of it relies on estimating the communication latency of other processes (but only for the most recent messages received).
Hint
By definition $$\sigma=\sqrt{\frac{\sum _{i=1}^n (v_i-\mu )^2}{n}}$$ that is to say that $$n \sigma^2=\sum _{i=1}^n (v_i-\mu )^2=\sum _{i=1}^n v_i^2-2 \mu \sum _{i=1}^n v_i+n \mu^2$$ Now, do the same as you did for the average.
I am sure that you can take it from here.
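Spelling out the hint: keep running sums of $v_i$ and $v_i^2$ over the window, update both in O(1) as values slide in and out, and recover $\sigma^2 = \frac{1}{n}\sum v_i^2 - \mu^2$. A sketch under that approach (names are my own invention):

```python
from collections import deque
import math

class SlidingStats:
    """Mean and standard deviation of the last n values, O(1) per update."""
    def __init__(self, n):
        self.n = n
        self.window = deque()
        self.s = 0.0   # running sum of v_i
        self.s2 = 0.0  # running sum of v_i^2

    def push(self, v):
        self.window.append(v)
        self.s += v
        self.s2 += v * v
        if len(self.window) > self.n:
            old = self.window.popleft()
            self.s -= old
            self.s2 -= old * old

    def mean(self):
        return self.s / len(self.window)

    def std(self):
        k = len(self.window)
        mu = self.s / k
        # max(..., 0.0) guards against tiny negative values from rounding
        return math.sqrt(max(self.s2 / k - mu * mu, 0.0))
```

One caveat: the sum-of-squares form can lose precision through cancellation when the values are large relative to their spread, so for latency estimates in a long-running process it may be worth the occasional full recomputation from the buffered window.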