Polynomial Time Root Extraction


Given a consistent system of polynomial equations:

$A_1(x_1, x_2, x_3, \ldots, x_n) = 0$

$A_2(x_1, x_2, x_3, \ldots, x_n) = 0$

$\vdots$

$A_n(x_1, x_2, x_3, \ldots, x_n) = 0$

If we let $d_1, d_2, d_3, \ldots, d_n$ be the degrees of the polynomials $A_1, A_2, A_3, \ldots, A_n$ respectively, then by Bézout's theorem the number of solutions (over an algebraically closed field, counted with multiplicity, when it is finite) is at most:

$d_1 \cdot d_2 \cdot d_3 \cdots d_n$

This bound grows exponentially as the dimension of the problem grows linearly, so any solver that enumerates all solutions of the system is necessarily an EXPTIME algorithm.
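To make the bound concrete, here is a minimal sketch using SymPy on a toy system of my own choosing (not from the question): $x^2 - 1 = 0$, $y^2 - 1 = 0$, where the Bézout bound $2 \cdot 2 = 4$ is attained exactly.

```python
import sympy as sp

x, y = sp.symbols('x y')
system = [x**2 - 1, y**2 - 1]  # toy system: degrees 2 and 2

# Bezout bound: the product of the total degrees of the equations
bezout = 1
for p in system:
    bezout *= sp.Poly(p, x, y).total_degree()

# Enumerate every solution explicitly (exponential work in general)
sols = sp.solve(system, [x, y], dict=True)

print(bezout)     # 4
print(len(sols))  # 4: the sign patterns (+-1, +-1)
```

For generic systems the bound is tight, which is exactly why full enumeration scales exponentially with the number of variables.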

However, this still leaves a question open:

If the number of roots grows at an exponential rate $O(o^d)$, where $o$ is the order of the polynomials and $d$ is the dimension of the system, how does the time to calculate the roots compare to that?

For example, a "polynomial-time" algorithm may exist in the following sense: even though the solver must do exponential total work to enumerate the exponentially many roots, each individual root takes only time polynomial in the order and dimension of the system to calculate.

That is, if an individual root took time $O(o^3 + d^3)$ (or some other polynomial) to compute, the general enumeration problem would remain in EXPTIME, but locating an individual solution would become a polynomial-time affair.
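The distinction can be illustrated with a toy family of systems (my own example, not from the question): $x_i^2 - 1 = 0$ for $i = 1, \ldots, n$ has $2^n$ solutions in total, yet writing down one particular solution takes only $O(n)$ time.

```python
def count_solutions(n):
    # Bezout number of the system x_i**2 - 1 = 0, i = 1..n:
    # each equation has degree 2, and every sign pattern is a root
    return 2 ** n

def one_solution(n):
    # Picking x_i = +1 for every i satisfies all n equations,
    # so a single root is produced in linear time
    return [1] * n

print(count_solutions(20))  # 1048576 roots in total
print(one_solution(5))      # [1, 1, 1, 1, 1], one root, found instantly
```

So for this particular family the answer is trivially "yes, one root is cheap"; the question is whether anything similar holds for arbitrary systems.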

Is finding a single solution to a system of polynomials doable in polynomial time? Or is that also EXPTIME, given that the general problem is considered NP-hard?