Suppose we have $N$ distinct real numbers $x_1, x_2, \ldots, x_N$ and consider the $N \times N$ matrix $$P = \left[\! \begin{array}{ccccc} 1 & 1 & 1 & \cdots & 1 \\ x_1 & x_2 & x_3 & \cdots & x_N \\ \frac{x_1^2}{2!} & \frac{x_2^2}{2!} & \frac{x_3^2}{2!} & \cdots & \frac{x_N^2}{2!} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \frac{x_1^{N-1}}{(N-1)!} & \frac{x_2^{N-1}}{(N-1)!} & \frac{x_3^{N-1}}{(N-1)!} & \cdots & \frac{x_N^{N-1}}{(N-1)!} \end{array}\!\right] \text{ where } P_{ij} = \frac{x_j^{i-1}}{(i-1)!}$$ I seem to have stumbled upon the following explicit formula for the entries of $Q = P^{-1}$ in terms of elementary symmetric polynomials in the $x_n$: $$ Q_{ij} = \frac{(j-1)!\ (-1)^{j-1} \sigma_{N-j}(x_{n \ne i})}{\prod_{k \ne i} (x_k - x_i)}$$ Here, $\sigma_k(x_{n \ne i})$ denotes the $k$th elementary symmetric polynomial in the $x_n$ with $x_i$ removed, and the product in the denominator runs over $k = 1, \ldots, N$ with $k \ne i$.
Can anybody give a high-level explanation of why this formula works? The appearance of elementary symmetric polynomials is rather perplexing to me, and I'm not even sure how to establish that $\sum_{k=1}^N P_{ik} Q_{kj} = \delta_{ij}$.
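For what it's worth, the formula does check out numerically. Here is a minimal sketch (assuming `numpy`; all names are my own) that builds $P$ and the conjectured $Q$ for a few distinct points and checks $PQ = I$:

```python
import math
import itertools
import numpy as np

def elem_sym(vals, k):
    # k-th elementary symmetric polynomial sigma_k of the given values
    # (sigma_0 = 1, from the single empty combination)
    return sum(math.prod(c) for c in itertools.combinations(vals, k))

N = 5
x = [0.5, -1.0, 2.0, 3.5, -2.5]          # distinct points

# P[i, j] = x_j^i / i!  (0-based indices here)
P = np.array([[x[j]**i / math.factorial(i) for j in range(N)]
              for i in range(N)])

# Conjectured Q[i, j] = j! (-1)^j sigma_{N-1-j}(x without x_i) / prod_{k != i}(x_k - x_i)
Q = np.empty((N, N))
for i in range(N):
    others = [x[k] for k in range(N) if k != i]
    denom = math.prod(xk - x[i] for xk in others)
    for j in range(N):
        Q[i, j] = (math.factorial(j) * (-1)**j
                   * elem_sym(others, N - 1 - j)) / denom

print(np.allclose(P @ Q, np.eye(N)))     # → True
```

(The indices are shifted down by one relative to the formula above, since Python arrays are 0-based.)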
The denominators correspond to left multiplication by the inverse of a diagonal matrix $D$ with diagonal entries $(0!,1!,2!,\ldots,(n-1)!)$, so I'll ignore the denominators for now. What is left is the transpose of a Vandermonde matrix$~V$, with general entry $V_{i,j}=x_i^{j-1}$. An obvious interpretation of $V$ is that of the linear map $\Bbb R[X]_{<n}\to\Bbb R^n$ that evaluates polynomials of degree${}<n$ in each of the points $x_1,\ldots,x_n$, with respect to the standard bases of $\Bbb R[X]_{<n}$ (the monomials) and of $\Bbb R^n$.
The inverse matrix $V^{-1}$ is therefore the map $\Bbb R^n\to\Bbb R[X]_{<n}$ that interprets its argument $(v_1,\ldots,v_n)$ as the values at the points $x_1,\ldots,x_n$ of a polynomial function of degree${}<n$ and returns its coefficients. This is the problem of Lagrange interpolation, and column$~j$ of $V^{-1}$ therefore gives the coefficients of the Lagrange polynomial $\prod_{i\in\{1,\ldots,n\}\setminus\{j\}}\frac{X-x_i}{x_j-x_i}$, which takes the value $1$ at $x_j$ and $0$ at all the other $x_i$.
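To make this concrete, take $n=2$: one has
$$V=\begin{pmatrix}1&x_1\\1&x_2\end{pmatrix}, \qquad V^{-1}=\frac1{x_2-x_1}\begin{pmatrix}x_2&-x_1\\-1&1\end{pmatrix},$$
and the columns of $V^{-1}$ are precisely the coefficients (constant term first) of the two Lagrange polynomials $\frac{X-x_2}{x_1-x_2}$ and $\frac{X-x_1}{x_2-x_1}$.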
To compute your inverse matrix it remains to expand $\prod_{i\in\{1,\ldots,n\}\setminus\{j\}}(X-x_i)$ by Vieta's formulas, transpose, and multiply on the right by$~D$. The result is your formula.
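This recipe can be checked numerically as well. Below is a sketch (assuming `numpy`; names are my own) that builds each column of $V^{-1}$ from the corresponding Lagrange polynomial — `np.poly` performs exactly the Vieta expansion of the monic product over the other points — and then verifies that $(V^{-1})^{\mathsf T} D = P^{-1}$:

```python
import math
import numpy as np

x = np.array([0.5, -1.0, 2.0, 3.5])
n = len(x)

# Column j of V^{-1}: coefficients (in increasing degree) of
# prod_{i != j} (X - x_i) / (x_j - x_i).
Vinv = np.empty((n, n))
for j in range(n):
    others = np.delete(x, j)
    coeffs = np.poly(others)[::-1]       # Vieta expansion, low degree first
    Vinv[:, j] = coeffs / np.prod(x[j] - others)

D = np.diag([math.factorial(i) for i in range(n)])
P = np.array([[xj**i / math.factorial(i) for xj in x] for i in range(n)])

print(np.allclose(Vinv.T @ D, np.linalg.inv(P)))   # → True
```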