Algorithm(s) for computing an elementary symmetric polynomial

I've run into an application where I need to compute a bunch of elementary symmetric polynomials. It is trivial to compute a sum or product of quantities, of course, so my concern is with computing the "other" symmetric polynomials.

For instance (I use here the notation $\sigma_n^k$ for the $k$-th symmetric polynomial in $n$ variables), the Vieta formulae allow me to compute a bunch of symmetric polynomials all at once like so:

$$\begin{align*} &(x+t)(x+u)(x+v)(x+w)\\ &\qquad =x^4+\sigma_4^1(t,u,v,w)x^3+\sigma_4^2(t,u,v,w)x^2+\sigma_4^3(t,u,v,w)x+\sigma_4^4(t,u,v,w) \end{align*}$$

and, as I have said, $\sigma_4^1$ and $\sigma_4^4$ are trivial to compute on their own without having to resort to Vieta.

But what if I want to compute $\sigma_4^3$ only without having to compute all the other symmetric polynomials? More generally, my application involves a large-ish number of arguments, and I want to be able to compute "isolated" symmetric polynomials without having to compute all of them.

Thus, I'm looking for an algorithm for computing $\sigma_n^k$ given only $k$ and the arguments themselves, without computing the other symmetric polynomials. Are there any, or can I not do better than Vieta?

There are 2 best solutions below

Let us use the symbols $u_1, u_2, \ldots$ for the indeterminates $t, u, v, w$ in the question. The computation will be given in terms of a new set of indeterminates, $x_1, x_2, \ldots$ (the power sums), whose connection to the original indeterminates is given by:

$x_j = \sum_{i=1}^{n} u_i^j$

We define the derivation operator $\Delta$ acting on the new set of indeterminates as follows:

$\Delta x_j = j x_{j+1}$

$\Delta ab = a \Delta b + b \Delta a$

Then the $i$-th elementary symmetric polynomial is given by:

$\sigma_n^i = \frac{1}{i!}(x_1-\Delta)^{i-1}x_1$

The evaluation is performed in terms of the new indeterminates; afterwards, the expressions of the new indeterminates in terms of the original ones (the power sums above) need to be substituted in.
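This operator calculus can be carried out mechanically. Below is a minimal Python sketch (names and representation are my own, not from the answer) that represents polynomials in the $x_j$ as dictionaries mapping exponent tuples to coefficients, applies $(x_1-\Delta)$ repeatedly, and finally substitutes the power sums $x_j=\sum_i u_i^j$:

```python
from fractions import Fraction
from math import factorial

# A monomial in x_1, x_2, ... is a tuple of exponents (entry i is the
# exponent of x_{i+1}); a polynomial maps monomials to coefficients.

def mul_x1(poly):
    """Multiply a polynomial by x_1."""
    out = {}
    for mono, c in poly.items():
        m = list(mono) if mono else [0]
        m[0] += 1
        key = tuple(m)
        out[key] = out.get(key, 0) + c
    return out

def delta(poly):
    """Apply the derivation: Delta x_j = j * x_{j+1}, extended by Leibniz."""
    out = {}
    for mono, c in poly.items():
        for idx, e in enumerate(mono):
            if e == 0:
                continue
            j = idx + 1
            m = list(mono) + [0]      # make room for x_{j+1}
            m[idx] -= 1
            m[idx + 1] += 1
            while m and m[-1] == 0:   # keep the tuple canonical
                m.pop()
            key = tuple(m)
            out[key] = out.get(key, 0) + c * e * j
    return out

def sub(a, b):
    """a - b for polynomials."""
    out = dict(a)
    for mono, c in b.items():
        out[mono] = out.get(mono, 0) - c
        if out[mono] == 0:
            del out[mono]
    return out

def elementary(i, values):
    """sigma_n^i via (1/i!) (x_1 - Delta)^{i-1} x_1, substituting power sums."""
    poly = {(1,): 1}                  # start from x_1
    for _ in range(i - 1):
        poly = sub(mul_x1(poly), delta(poly))
    p = [sum(v ** j for v in values) for j in range(i + 1)]  # p[j] = power sum
    total = Fraction(0)
    for mono, c in poly.items():
        term = Fraction(c)
        for idx, e in enumerate(mono):
            term *= Fraction(p[idx + 1]) ** e
        total += term
    return total / factorial(i)
```

For example, `elementary(3, [1, 2, 3, 4])` evaluates $\sigma_4^3(1,2,3,4)$, which by direct expansion is $1\cdot2\cdot3+1\cdot2\cdot4+1\cdot3\cdot4+2\cdot3\cdot4=50$.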


You can use the Newton–Girard formulae. Write $p_k$ for the power sums and $e_k$ for the elementary symmetric polynomials:

$$p_k(x_1,\ldots,x_n)=\sum_{i=1}^nx_i^k = x_1^k+\cdots+x_n^k$$

$$e_k(x_1,\ldots,x_n)=\sum_{1 \leq i_1<i_2<\cdots<i_k\leq n}x_{i_1}x_{i_2}\cdots x_{i_k}$$

Then the elementary symmetric polynomials have a representation as determinants:

$$ e_k=\frac1{k!} \begin{vmatrix}p_1 & 1 & 0 & \cdots\\ p_2 & p_1 & 2 & 0 & \cdots \\ \vdots&& \ddots & \ddots \\ p_{k-1} & p_{k-2} & \cdots & p_1 & k-1 \\ p_k & p_{k-1} & \cdots & p_2 & p_1 \end{vmatrix} $$

We can compute this determinant using $O(k^2)$ additions and multiplications, via the following general result (from the citation at the end):

Let $A_n$ be an $n\times n$ lower Hessenberg matrix, meaning that $a_{ij}=0$ whenever $j\ge i+2$. For each $k\in \{1,\dots,n\}$, let $A_k$ be the $k\times k$ submatrix consisting of the upper $k$ rows and leftmost $k$ columns of $A_n$. Then for all $n\ge 1$,

$$\det A_n=a_{n,n}\det A_{n-1}+\sum_{r=1}^{n-1}(-1)^{n-r}a_{n,r}\prod_{j=r}^{n-1}a_{j,j+1}\det A_{r-1}$$ with the convention that $\det A_0=1$.
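The recurrence translates directly into code. Here is a short Python sketch (the function name is mine) computing the determinant of a lower Hessenberg matrix, given as a list of rows, in $O(n^2)$ arithmetic operations:

```python
def hessenberg_det(A):
    """Determinant of a lower Hessenberg matrix via the O(n^2) recurrence
    det A_m = a_{m,m} det A_{m-1}
              + sum_r (-1)^{m-r} a_{m,r} (prod_j a_{j,j+1}) det A_{r-1}."""
    n = len(A)
    det = [1]  # det[m] = determinant of the leading m x m submatrix
    for m in range(1, n + 1):
        d = A[m - 1][m - 1] * det[m - 1]
        prod = 1  # running product of superdiagonal entries a_{j,j+1}
        for r in range(m - 1, 0, -1):
            prod *= A[r - 1][r]           # a_{r,r+1} in 1-based indexing
            sign = -1 if (m - r) % 2 else 1
            d += sign * A[m - 1][r - 1] * prod * det[r - 1]
        det.append(d)
    return det[n]
```

Note that the recurrence never divides, so for integer entries the result is exact.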

Since it takes $O(nk)$ additions and multiplications to compute the list $p_1(x_1,\dots,x_n),\dots,p_k(x_1,\dots,x_n)$, the total complexity of this method is $O(nk+k^2)=O(nk)$ additions and multiplications (since $k\le n$).
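Putting the pieces together, here is a Python sketch (names mine) that computes $e_k$ over integer inputs by specializing the Hessenberg recurrence to the Newton–Girard matrix above, whose superdiagonal entries are $a_{j,j+1}=j$, and dividing by $k!$ at the end:

```python
from math import factorial

def elementary_symmetric(values, k):
    """e_k(values) in O(nk) integer operations, via the Hessenberg
    recurrence applied to the Newton-Girard determinant."""
    # Power sums: p[j] = sum of v**j, for j = 1..k (p[0] unused).
    p = [0] * (k + 1)
    for j in range(1, k + 1):
        p[j] = sum(v ** j for v in values)

    det = [0] * (k + 1)  # det[m] = determinant of the m x m submatrix A_m
    det[0] = 1
    for m in range(1, k + 1):
        d = p[1] * det[m - 1]             # a_{m,m} = p_1
        prod = 1                          # product of a_{j,j+1} = j, j = r..m-1
        for r in range(m - 1, 0, -1):
            prod *= r
            sign = -1 if (m - r) % 2 else 1
            d += sign * p[m - r + 1] * prod * det[r - 1]  # a_{m,r} = p_{m-r+1}
        det[m] = d

    q, rem = divmod(det[k], factorial(k))
    assert rem == 0  # det A_k is always divisible by k! for integer inputs
    return q
```

For instance, `elementary_symmetric([1, 2, 3, 4], 3)` gives $1\cdot2\cdot3+1\cdot2\cdot4+1\cdot3\cdot4+2\cdot3\cdot4=50$.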

Nathan D. Cahill, John R. D'Errico, Darren A. Narayan & Jack Y. Narayan (2002) Fibonacci Determinants, The College Mathematics Journal, 33:3, 221-225, DOI: 10.1080/07468342.2002.11921945