I was reading the book Algebra: Chapter 0, by Paolo Aluffi, and came across the following assertion on page 290, Exercise 5.9:
It is in fact much harder to factor integers than integer polynomials.
What I want to know is:
- What exactly is the meaning of "easier" here?
- Why is that so? It seems quite unintuitive to me that finding a factorization of a high-degree polynomial (with many large integer coefficients) should be easier than factoring the degree of the polynomial itself.
The precise meaning is that there are faster algorithms to factor a polynomial with $n$ coefficients, each with $m$ bits, than to factor an integer with $n\cdot m$ bits. It is of course easier to factor the number $n$ itself, but this is not the correct comparison, because the number of bits required to express $n$ is much smaller than the number of bits required to express the polynomial.
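To make the size comparison concrete, here is a small Python sketch (the degree and coefficient size are arbitrary example values): writing down the polynomial takes about $n\cdot m$ bits, while writing down its degree $n$ takes only about $\log_2 n$ bits.

```python
# Compare the bit-size of a polynomial's description with the
# bit-size of its degree.  The numbers are arbitrary examples.
n = 1000            # degree of the polynomial
m = 64              # bits per coefficient

poly_bits = n * m            # bits to write down the whole polynomial
degree_bits = n.bit_length() # bits to write down just the degree

print(poly_bits)    # 64000
print(degree_bits)  # 10
```

So an algorithm that is "fast in the size of the input" is allowed vastly more time on the polynomial than on its degree.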
For polynomials, there are algorithms which can factor the polynomial in an amount of time polynomial in $n\cdot m$. The strategy is essentially to factor the polynomial modulo a prime $p$, lift that factorization modulo $p^k$ for larger and larger $k$ (Hensel lifting), and then piece these together into a factorization over the integers; lattice basis reduction (LLL) makes this last step run in polynomial time. A more thorough discussion can be found on this Wikipedia page.
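As a toy illustration of the first step only, the following sketch finds the linear factors of a polynomial modulo a small prime by exhaustive root search. (Real algorithms use Berlekamp or Cantor–Zassenhaus for the mod-$p$ factorization, plus Hensel lifting and LLL; this brute-force version is just to show what "factoring mod $p$" means.)

```python
def poly_eval(coeffs, x, p):
    """Evaluate a polynomial (constant term first) at x, mod p,
    using Horner's rule."""
    result = 0
    for c in reversed(coeffs):
        result = (result * x + c) % p
    return result

def roots_mod_p(coeffs, p):
    """All roots of the polynomial in F_p, by exhaustive search.
    Each root r corresponds to a linear factor (x - r) mod p."""
    return [x for x in range(p) if poly_eval(coeffs, x, p) == 0]

# x^2 - 1 = (x - 1)(x + 1); mod 7 its roots are 1 and 6.
print(roots_mod_p([-1, 0, 1], 7))  # [1, 6]
```

The exhaustive search takes time proportional to $p$, which is fine for the small primes used in practice; the hard part of the real algorithm is recombining the mod-$p^k$ factors into integer factors efficiently.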
For integers, no algorithm is known whose runtime is polynomial in $n\cdot m$. The best known classical algorithm, the horrendously complicated General Number Field Sieve, is still sub-exponential rather than polynomial.
One reason this might seem surprising is that it is much easier to write a correct integer factorization algorithm than a correct polynomial factorization algorithm. It turns out that if you replace "correct" with "fast", the opposite is true.
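To illustrate the "correct but slow" side: trial division is a complete, correct integer factorization algorithm that fits in a few lines, yet its running time grows roughly like $\sqrt{N}$, which is exponential in the bit-length of $N$.

```python
def factor(n):
    """Return the prime factorization of n >= 2 as a sorted list
    of primes, by trial division.  Correct, but exponential in
    the bit-length of n."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is prime
        factors.append(n)
    return factors

print(factor(666))  # [2, 3, 3, 37]
```

A correct *polynomial* factorization routine, by contrast, takes real work to write, and yet the best polynomial factoring algorithms run in polynomial time while the best integer ones do not.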