Time complexity of Vieta's formulas


I need to find the time complexity of calculating the coefficients of a polynomial of degree $n$, given its roots. That is, if for $P(x)=(x-r_1)(x-r_2)\cdots(x-r_n)=\sum_{i=0}^{n}a_ix^i$ the values of $r_1, r_2,\dots, r_n$ are given, find $a_k$ for $k=0,1,\dots,n$. I am familiar with Vieta's formulas, and I have managed to figure out a couple of things:

  • For a polynomial of degree $n$, there are $n$ equations expressing the coefficients $a_{n-1}$ down to $a_0$. The polynomial has to be monic, because Vieta's formulas don't determine $a_n$.
  • The $k$-th equation has $T=\binom{n}{k}$ terms, for $k=1,\dots,n$.
  • Each term in the $k$-th equation is a product of $k$ roots, requiring $k-1$ multiplications.
  • Each equation needs $T-1$ additions to sum all $T$ terms.

Putting this together, I came up with the following formula: $$f(n)=\sum_{k=1}^{n}\Bigg[\binom{n}{k}(k-1)+\binom{n}{k}-1\Bigg]=\sum_{k=1}^{n}\Bigg[k\binom{n}{k}-1\Bigg]=\Bigg[\sum_{k=1}^{n}k\binom{n}{k}\Bigg]-n$$
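To make the counting concrete, here is a minimal Python sketch of this brute-force approach, computing each coefficient as a signed elementary symmetric sum over all $\binom{n}{k}$ subsets of roots (the function name `vieta_bruteforce` is illustrative, not from the question):

```python
from itertools import combinations
from math import prod

def vieta_bruteforce(roots):
    """Coefficients of the monic polynomial with the given roots,
    computed term by term from Vieta's formulas."""
    n = len(roots)
    coeffs = [0] * (n + 1)
    coeffs[n] = 1  # monic: a_n = 1
    for k in range(1, n + 1):
        # a_{n-k} = (-1)^k * e_k, where e_k is the k-th elementary
        # symmetric sum: one product of k roots per k-subset.
        e_k = sum(prod(c) for c in combinations(roots, k))
        coeffs[n - k] = (-1) ** k * e_k
    return coeffs  # coeffs[i] is the coefficient of x^i
```

For example, `vieta_bruteforce([1, 2])` returns `[2, -3, 1]`, the coefficients of $(x-1)(x-2)=x^2-3x+2$. The nested loops perform exactly the multiplications and additions counted above.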

I don't know whether my reasoning is correct, though I suspect it is not far off. I also don't know how to evaluate the sum in the last equality. Is there a better way to arrive at the solution? It seems that when this sum is evaluated, the complexity will be very high, which is counterintuitive to me.

Best answer
  1. I think it would make more sense to count additions and multiplications separately, since multiplications are likely to be more expensive than additions, especially because some of the numbers being multiplied here can get large.

  2. $\sum k {n \choose k}$ can be evaluated to be $n \cdot 2^{n-1}$ in several ways (a cute one is to add it to itself but indexing the sum in reverse order, with $k$ replaced by $n-k$).

  3. However, that doesn't matter because it's actually possible to do much better than your proposed algorithm. The solution is suggested by thinking about the expression $(x - r_1) \dots (x - r_n)$ itself: we can just iteratively compute the coefficients of the polynomials $P_k(x) = \prod_{i=1}^k (x - r_i)$. At the $k^{th}$ step, multiplying by $x - r_{k+1}$ costs $k$ multiplications (multiplying the polynomial by $-r_{k+1}$) and then $k$ additions (adding the results to the polynomial multiplied by $x$, which just means shifting the coefficients). In total this means we only need ${n \choose 2}$ additions and ${n \choose 2}$ multiplications.
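The iterative scheme from point 3 can be sketched in a few lines of Python (names are illustrative, not from the answer): each pass multiplies the current coefficient list by $(x - r)$ using one shift, $k$ multiplications, and $k$ additions.

```python
def expand_roots(roots):
    """Iteratively multiply out (x - r_1)...(x - r_n).
    coeffs[i] holds the coefficient of x^i."""
    coeffs = [1]  # P_0(x) = 1
    for r in roots:
        new = [0] + coeffs                # coefficients of x * P_k
        for i in range(len(coeffs)):
            new[i] -= r * coeffs[i]       # subtract r * P_k
        coeffs = new
    return coeffs
```

For example, `expand_roots([1, 2])` returns `[2, -3, 1]`, i.e. $x^2-3x+2$. The inner loop runs $k$ times at step $k$, giving the $\binom{n}{2}$ multiplications and $\binom{n}{2}$ additions claimed above.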

There is apparently some literature on efficient computation of the elementary symmetric polynomials and in that literature this is known as the "summation algorithm"; see, for example, this paper I found by googling.
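As an aside, the closed form $n \cdot 2^{n-1}$ from point 2 is easy to check numerically (a throwaway sketch, assuming Python 3.8+ for `math.comb`):

```python
from math import comb

# Verify sum_{k=1}^{n} k * C(n, k) == n * 2^(n-1) for small n.
for n in range(1, 15):
    assert sum(k * comb(n, k) for k in range(1, n + 1)) == n * 2 ** (n - 1)
```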