What is the Computational Complexity of the Elementary Symmetric Polynomials


The elementary symmetric polynomials in $n$ variables, $e_k(X_1,\dots,X_n)$, are defined implicitly by $$(X-X_1)(X-X_2) \cdots (X-X_n)=\sum_{k=0}^n (-1)^k e_k(X_1,\dots,X_n) X^{n-k}$$ and explicitly by $e_0 = 1$ and $$e_k (X_1 , \ldots , X_n )=\sum_{1\le j_1 < j_2 < \cdots < j_k \le n} X_{j_1} \dotsm X_{j_k}, \quad 1 \leq k \leq n. $$

I'm interested in computing these polynomials efficiently, meaning with low computational complexity. The explicit formula suggests a direct implementation: form the $\binom{n}{k}$ products of $k$ factors each and add them all.
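A sketch of that direct implementation in Python (my own naming), using `itertools.combinations` to enumerate the $\binom{n}{k}$ index subsets:

```python
from itertools import combinations
from math import prod

def e_naive(k, xs):
    """e_k(xs) straight from the explicit formula:
    sum over all k-element subsets of the product of their entries."""
    if k == 0:
        return 1
    # C(n, k) subsets, each product costing k - 1 multiplications
    return sum(prod(c) for c in combinations(xs, k))

print(e_naive(2, [1, 2, 3]))  # 1*2 + 1*3 + 2*3 = 11
```

Note that for $k > n$ the sum is empty and the function correctly returns $0$.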

I'm a bit confused about what the complexity of $e_k(X_1,\dots,X_n)$ means exactly, since $k$ may depend on $n$. Fixing a particular $k$, e.g. $k=2$, the naive method costs $\mathcal{O}(n^2)$ operations. The exponent in this bound appears (to me) to grow with $k$ up to $k=\frac{n}{2}$ and then fall again. This naive statement suffers from the fact that $k$ depends on $n$, as I said, and I don't know much about complexity in two parameters.

In any case, I'm interested in computationally quick methods of calculating the $e_k$'s.
Is this a well known problem?
Are there any provable tight bounds?


Best answer

It is possible to compute $e_0, e_1, \dots, e_n$ in $\mathcal{O}(n^2)$ operations, if I understand your question correctly. We can build a tree-like scheme from the following identity:

$$e_k(x_1,x_2,...,x_n) = x_n\cdot e_{k-1}(x_1,x_2,...,x_{n-1}) + e_k(x_1,x_2,...,x_{n-1}) $$

This recurrence relation involves two operations (one multiplication and one addition). Let us write $e^n_k = e_k(x_1,\dots,x_n)$ for short. We then get a tree-like structure in which every 'parent' is connected to its two 'children' according to the recurrence.

$$e_0^4\quad e_1^4\quad e^4_2\quad e^4_3\quad e^4_4$$ $$e_0^3\quad e_1^3\quad e^3_2\quad e^3_3$$ $$e_0^2\quad e_1^2\quad e^2_2$$ $$e_0^1\quad e_1^1$$ $$e_0^0$$

If we strip out the 'cheap' operations, assigning $1$ to every $e_0$ and $x_1$ to $e_1^1$, then the recurrence simplifies for $e_1^n$ (written $\sigma^n$) and for $e_n^n$ (written $\pi^n$). We are left with two relations, each costing one operation: $\sigma^n=x_n+\sigma^{n-1}$ and $\pi^n=x_n\cdot\pi^{n-1}$.

$$1\quad \sigma^4\quad e^4_2\quad e^4_3\quad \pi^4$$ $$1\quad \sigma^3\quad e^3_2\quad \pi^3$$ $$1\quad \sigma^2\quad \pi^2$$ $$1\quad x_1$$ $$1$$

If we assign each edge a weight of $0$, $1$, or $2$ according to the number of operations it uses, then summing all the weights gives a function of $n$: the total is $n(n-1)$, hence $\mathcal{O}(n^2)$. Hopefully this answer satisfies your question.
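The scheme above can be sketched in Python (naming is my own). Each input $x_j$ extends the previous row of the table; updating the row in place from right to left keeps the old value $e_{k-1}^{j-1}$ available when $e_k^j$ is computed:

```python
def elementary_symmetric(xs):
    """Return [e_0, e_1, ..., e_n] for the inputs xs, via the recurrence
    e_k(x_1..x_j) = x_j * e_{k-1}(x_1..x_{j-1}) + e_k(x_1..x_{j-1}).
    O(n^2) additions/multiplications, O(n) extra space."""
    n = len(xs)
    e = [1] + [0] * n              # e_0 = 1; e_k = 0 before any x is seen
    for j, x in enumerate(xs, start=1):
        # right-to-left so e[k-1] still holds the previous row's value
        for k in range(j, 0, -1):
            e[k] = x * e[k - 1] + e[k]
    return e

print(elementary_symmetric([1, 2, 3]))  # [1, 6, 11, 6]
```

This is the same computation as expanding $(X-x_1)\cdots(X-x_n)$ one factor at a time (Vieta's formulas), which is why the coefficients come out in a single $\mathcal{O}(n^2)$ pass.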

P.S. If you want to compute only one $e_k$, you can use the same tree structure and count only the edges actually used; that gives the operation count. It is certainly bounded by $\mathcal{O}(n^2)$. I think it should look like the following (not sure, so correct me if I'm wrong):

$$\mathcal{O}_{n,k} = \begin{cases} 0 & k=0\lor k>n\\ (2k-1)(n-k)+k-1 & k\leq \lceil\frac{n}{2}\rceil\\ \mathcal{O}_{n,n-k+1} & \text{otherwise}\\ \end{cases} $$
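For the single-$e_k$ case, one concrete option (a sketch of my own, not part of the answer above) is to truncate the same recurrence at row $k$, keeping only $e_0,\dots,e_k$ as each input is absorbed; this uses roughly $\mathcal{O}(nk)$ operations:

```python
def e_single(k, xs):
    """Compute just e_k(xs) by keeping only e_0..e_k of each row
    of the recurrence; roughly O(n*k) operations."""
    e = [1] + [0] * k
    for x in xs:
        for i in range(k, 0, -1):
            e[i] = x * e[i - 1] + e[i]
    return e[k]

print(e_single(2, [1, 2, 3, 4]))  # 35
```

This agrees with the intuition above that, for fixed small $k$, the cost grows only linearly in $n$.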