The error in Lagrange interpolation is $$ E(x)\equiv f(x)-L_n(x)=\frac{f^{(n+1)}(\xi_x)}{(n+1)!}\prod_{j=0}^n(x-x_j) $$ where $f\in C^{(n+1)}([a,b])$ is the function to be interpolated (or extrapolated), $\{x_0,\dots,x_n\}$ is a set of distinct nodes in $[a,b]$, and $L_n$ is the Lagrange interpolation polynomial (of degree at most $n$). The standard proof applies Rolle's theorem repeatedly, $n+1$ times, to an auxiliary function that vanishes at the nodes and at $x$, and concludes that such a $\xi_x\in[a,b]$ must exist.
Question: the Rolle argument suggests to me that $\xi_x$ should lie somewhere in the middle of the range spanned by the nodes and $x$, and that we could use the following approximate value for $\xi_x$: $$ \xi_x\approx \frac{x+x_0+\cdots+x_n}{n+2} $$ Is anything known about such refinements of the error term?
(In my concrete problem $f(x)=e^{-x}$, and some further calculations could benefit from this approximation, if it is valid at all.)
I am answering my own question; maybe others will find it useful in the future. I don't think this result is known elsewhere, as it is very specific and I only figured it out after a day of trial and error with Maple.
So consider this formula to start: $$ L_{m-1}(0)=\sum_{k=1}^m e^{-x_k}\prod_{j=1(\ne k)}^m \frac{x_j}{x_j-x_k},\quad m=1,2,\dots. $$ The product on the RHS is the Lagrange basis polynomial $l_k(x)$ evaluated at $x=0$. The whole sum is therefore the Lagrange interpolation polynomial (of degree $m-1$) of the function $f(x)=e^{-x}$, evaluated at $x=0$; hence the notation $L_{m-1}(0)$. In my concrete case the nodes $x_k$ are all positive, but in what follows they can take any values as long as they are pairwise distinct.
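This sum is easy to evaluate numerically. Below is a minimal Python sketch (the function name `lagrange_at_zero` is my own) that computes it directly; for $m=1$ the interpolant is the constant $e^{-x_1}$, which gives a quick sanity check, and for small positive nodes the value is close to $f(0)=1$.

```python
import math

def lagrange_at_zero(xs):
    """Evaluate at x = 0 the Lagrange interpolant of f(x) = exp(-x)
    through the distinct nodes xs = [x_1, ..., x_m]."""
    total = 0.0
    for k, xk in enumerate(xs):
        # e^{-x_k} times the basis polynomial l_k(x) evaluated at x = 0
        term = math.exp(-xk)
        for j, xj in enumerate(xs):
            if j != k:
                term *= xj / (xj - xk)
        total += term
    return total

print(lagrange_at_zero([0.5]))             # m = 1: just exp(-0.5)
print(lagrange_at_zero([0.1, 0.2, 0.3]))   # close to f(0) = 1 for small nodes
```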
Using the error formula we get $$ f^{(m)}(\xi_0)=\frac{m!(f(0)-L_{m-1}(0))}{(-1)^m\prod_{j=1}^m x_j} $$ where $\xi_0$ is some value whose existence is guaranteed by Rolle's theorem. Now, since $f$ is just the exponential function, all derivatives exist and are continuous; moreover $f^{(m)}(x)=(-1)^m e^{-x}$, so the equation above can be solved for $\xi_0$ explicitly. Putting everything together, $$ \xi_0=-\log\left[ \frac{m!}{\prod_{j=1}^m x_j} \left(1-\sum_{k=1}^m e^{-x_k}\prod_{j=1(\ne k)}^m \frac{x_j}{x_j-x_k} \right) \right] $$ The main result that I "derived" with Maple is that $\xi_0$ has the following asymptotics as all $x_k\to0$: $$ \xi_0\sim\frac{1}{m+1}\sum_{k=1}^m x_k -\frac{m}{2(m+1)^2(m+2)} \sum_{k=1}^m x_k^2 +\frac{1}{(m+1)^2(m+2)} \sum_{k=1}^m\sum_{j=k+1}^m x_k x_j +\mathcal{O}(x^3) $$ The leading (linear) term is exactly what I suspected in my question above. The second-order diagonal and off-diagonal terms were new to me as well, but they also drop out of the Maple computation.
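The claimed asymptotics can be spot-checked numerically without Maple. The sketch below (the function names are my own) compares the exact $\xi_0$ from the closed-form logarithm above with the second-order series, for small nodes; the two agree up to the neglected $\mathcal{O}(x^3)$ terms.

```python
import math

def xi0_exact(xs):
    """Closed-form xi_0 for f(x) = exp(-x), from the error formula."""
    m = len(xs)
    # L_{m-1}(0): Lagrange interpolant of exp(-x) evaluated at x = 0
    L = sum(math.exp(-xk) * math.prod(xj / (xj - xk)
                                      for j, xj in enumerate(xs) if j != k)
            for k, xk in enumerate(xs))
    return -math.log(math.factorial(m) / math.prod(xs) * (1.0 - L))

def xi0_series(xs):
    """Second-order small-node expansion of xi_0."""
    m = len(xs)
    s1 = sum(xs)
    s2 = sum(x * x for x in xs)
    cross = sum(xs[k] * xs[j] for k in range(m) for j in range(k + 1, m))
    return (s1 / (m + 1)
            - m * s2 / (2 * (m + 1) ** 2 * (m + 2))
            + cross / ((m + 1) ** 2 * (m + 2)))

xs = [0.02, 0.04, 0.06]
print(xi0_exact(xs), xi0_series(xs))   # agree to O(x^3)
```

For $m=1$ the series reduces to $x_1/2 - x_1^2/24$, which matches a direct expansion of $-\log\big((1-e^{-x_1})/x_1\big)$ by hand.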
In my concrete problem I want to calculate $L_{m-1}(0)$ using some simpler expression. If we now truncate the series for $\xi_0$ at second order, denote the result by $\xi_0^{(2)}$, and plug it back into the error formula (using $f(0)=1$), we obtain the approximation $$ L_{m-1}(0)\approx 1-\frac{e^{-\xi_0^{(2)}}}{m!}\prod_{j=1}^m x_j $$
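As a final sanity check (again, the function names below are my own), the approximation can be compared against the exact sum: the truncation error in $\xi_0$ is $\mathcal{O}(x^3)$ and enters $L_{m-1}(0)$ multiplied by the small factor $\prod_j x_j/m!$, so for small nodes the two values are extremely close.

```python
import math

def lagrange_at_zero(xs):
    """Exact L_{m-1}(0) for f(x) = exp(-x)."""
    return sum(math.exp(-xk) * math.prod(xj / (xj - xk)
                                         for j, xj in enumerate(xs) if j != k)
               for k, xk in enumerate(xs))

def lagrange_at_zero_approx(xs):
    """Approximation 1 - exp(-xi_0^(2)) * prod(x_j) / m!, with xi_0
    truncated at second order in the nodes."""
    m = len(xs)
    s1 = sum(xs)
    s2 = sum(x * x for x in xs)
    cross = sum(xs[k] * xs[j] for k in range(m) for j in range(k + 1, m))
    xi2 = (s1 / (m + 1)
           - m * s2 / (2 * (m + 1) ** 2 * (m + 2))
           + cross / ((m + 1) ** 2 * (m + 2)))
    return 1.0 - math.exp(-xi2) * math.prod(xs) / math.factorial(m)

xs = [0.1, 0.2, 0.3]
print(lagrange_at_zero(xs))         # exact sum
print(lagrange_at_zero_approx(xs))  # second-order approximation
```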