I recently had an issue where a function in a programming language computed the mean of $n$ numbers $x_1, \ldots, x_n$ recursively: it averaged the first two values, then folded each subsequent value in, dividing by its index. For $n = 4$ this gives
$$ M_1 = \dfrac{x_4+\dfrac{x_3+\dfrac{x_1+x_2}{2}}{3}}{4} $$
instead of the usual:
$$ M_2 = \dfrac{1}{n}\sum_{i=1}^nx_i $$
Is there a name for this kind of summation, and is there a theoretical result for how far $M_1$ can deviate from the usual mean $M_2$? Specifically I had the case $x_1, \ldots, x_n \in [0,1]$, but I am curious about $x_1, \ldots, x_n \in \mathbb{R}$ as well: how far can the two differ, and in what cases do they coincide? Thanks!
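For concreteness, here is a minimal Python sketch of the two computations (the function and variable names are mine, not the original code's):

```python
def skewed_mean(xs):
    """The buggy fold: m_1 = x_1, then m_k = (x_k + m_{k-1}) / k."""
    m = xs[0]
    for k in range(2, len(xs) + 1):
        m = (xs[k - 1] + m) / k
    return m

def usual_mean(xs):
    """M_2 = (1/n) * sum(x_i)."""
    return sum(xs) / len(xs)

xs = [0.2, 0.4, 0.6, 0.8]
print(skewed_mean(xs))  # close to 0.275
print(usual_mean(xs))   # 0.5
```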
Expanding the nested fractions shows that $M_1$ is a weighted sum:
$$M_1(x_1, \ldots , x_n) = \sum_{k=1}^{n} \frac{(k-1)!}{n!} x_{k}$$
The sum of the weights is
$$ u_n = \frac{1}{n!}\sum_{k=1}^{n} (k-1)!, $$
which satisfies the recurrence
\begin{align} (n+1) \, u_{n+1} &= 1+u_{n} \\ (n+2) \, u_{n+2} &= 1+u_{n+1}. \end{align}
Subtracting the first relation from the second and dividing by $n+2$ gives
$$ u_{n+2} = u_{n+1}-\frac{u_{n}}{n+2}. $$
The first few values are
$$ u_1 = 1, \qquad u_2 = 1, \qquad u_3 = \frac{2}{3}, \qquad u_4 = \frac{5}{12}. $$
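As a sanity check, here is a short Python snippet (names mine) comparing the nested fold with the factorial-weight form, and evaluating $u_n$ both directly and via the recurrence:

```python
from math import factorial, isclose

def nested(xs):
    # m_1 = x_1, then m_k = (x_k + m_{k-1}) / k
    m = xs[0]
    for k in range(2, len(xs) + 1):
        m = (xs[k - 1] + m) / k
    return m

def weighted(xs):
    # M_1 = sum_{k=1}^n (k-1)!/n! * x_k
    n = len(xs)
    return sum(factorial(k - 1) * x for k, x in enumerate(xs, 1)) / factorial(n)

def u_direct(n):
    # u_n = (1/n!) * sum_{k=1}^n (k-1)!
    return sum(factorial(k - 1) for k in range(1, n + 1)) / factorial(n)

def u_rec(n):
    # (m+1) u_{m+1} = 1 + u_m, starting from u_1 = 1
    u = 1.0
    for m in range(1, n):
        u = (1 + u) / (m + 1)
    return u

xs = [0.3, 0.9, 0.1, 0.7]
assert isclose(nested(xs), weighted(xs))
assert all(isclose(u_direct(n), u_rec(n)) for n in range(1, 10))
print(u_direct(3), u_direct(4))  # 2/3 and 5/12
```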
For $n > 2$,
$$0<u_n<1,$$
so $M_1$ is not a genuine weighted average. In particular, for $x_1, \ldots, x_n \in [0,1]$ we get $0 \le M_1 \le u_n < 1$, and taking all $x_i = 1$ gives $M_2 = 1$ while $M_1 = u_n$, so the deviation can be as large as $1 - u_n$. Since $u_n \sim 1/n \to 0$, this gap approaches $1$ as $n$ grows.
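A quick numerical check of this bound (Python, names mine): for inputs in $[0,1]$ the skewed mean never exceeds $u_n$, and the all-ones input attains it.

```python
from math import factorial, isclose
import random

def u(n):
    # u_n = (1/n!) * sum_{k=1}^n (k-1)!
    return sum(factorial(k - 1) for k in range(1, n + 1)) / factorial(n)

def skewed_mean(xs):
    # m_1 = x_1, then m_k = (x_k + m_{k-1}) / k
    m = xs[0]
    for k in range(2, len(xs) + 1):
        m = (xs[k - 1] + m) / k
    return m

random.seed(0)
for n in range(3, 15):
    assert 0 < u(n) < 1                           # the bound above
    assert isclose(skewed_mean([1.0] * n), u(n))  # all-ones input attains u_n
    xs = [random.random() for _ in range(n)]
    assert skewed_mean(xs) <= u(n) + 1e-12        # never exceeds u_n on [0,1]
print(u(10))  # roughly 0.113
```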