Suppose we have an ordered set of numbers, for example:
$\{0,2,3,5,9,10\}$
Now we take the mean of the first two numbers:
$(0+2)/2=1$
Then we take the mean of this result and the third number of our ordered set:
$(1+3)/2=2$
We continue in this way until we have done this with the last number of our ordered set.
Does this procedure have a name?
This usually goes by the name "exponential smoothing". Say you have a sequence $\langle a_k \rangle$; you smooth the sequence by selecting $0 \le \alpha \le 1$ and computing:
$\begin{align*} A_0 &= \text{more or less arbitrary value} \\ A_{k + 1} &= \alpha a_k + (1 - \alpha) A_k \end{align*}$
The idea is that the $A_k$ are a less bumpy representation of the $a_k$, incorporating some of the historic evolution. If $\alpha = 0$, only history matters; if $\alpha = 1$, only the last value is considered. Intermediate values give intermediate weight to history. Your procedure is exactly this with $\alpha = 1/2$, seeding $A_0$ with the first element of the set.
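As a sanity check, here is a short sketch of the recurrence (the function name and the choice of seeding $A_0$ with the first element are mine, not standard); with $\alpha = 1/2$ it reproduces the averaging procedure from the question:

```python
def exponential_smoothing(values, alpha):
    """Apply A_{k+1} = alpha * a_k + (1 - alpha) * A_k to a list of values."""
    smoothed = values[0]  # A_0: a more or less arbitrary value; here, the first element
    out = []
    for a in values[1:]:
        smoothed = alpha * a + (1 - alpha) * smoothed
        out.append(smoothed)
    return out

# alpha = 1/2 on the set {0, 2, 3, 5, 9, 10} from the question:
print(exponential_smoothing([0, 2, 3, 5, 9, 10], 0.5))
# → [1.0, 2.0, 3.5, 6.25, 8.125]
```

The first two outputs, $1$ and $2$, match the two averaging steps worked out above.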
This is quite popular, for example, in operating systems, which handle lots of variables that require monitoring, where the last value alone can be very misleading, and the resources required for a decent time series analysis just aren't available (nor is the analysis warranted). It takes up very little memory (one $A$ value per variable) and negligible processing (if you select $\alpha$ carefully, you don't even need to multiply!).
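To illustrate the "no multiply" remark: with $\alpha = 1/2$ and integer samples, the update $A \leftarrow \frac{1}{2}a + \frac{1}{2}A$ is just an add and a right shift. The class below is a hypothetical sketch of such a monitor, not any particular OS's implementation:

```python
class SmoothedCounter:
    """Exponential smoothing with alpha = 1/2 using only add and shift."""

    def __init__(self, initial=0):
        self.value = initial  # the single stored A value

    def update(self, sample):
        # alpha = 1/2: A = (A + sample) / 2, done as an integer right shift
        self.value = (self.value + sample) >> 1
        return self.value

mon = SmoothedCounter(0)
for s in (0, 2, 3, 5, 9, 10):
    mon.update(s)
print(mon.value)  # → 8 (the integer counterpart of 8.125 above)
```

More generally, any $\alpha = 2^{-n}$ keeps the update multiplication-free, since $A \leftarrow A + ((a - A) \gg n)$.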