Is a proof required for the Division Algorithm for polynomials?


I understand that the Division Algorithm can be applied to polynomials: for any polynomials $f,g$ with $g \neq 0$, there exist polynomials $q,r$ such that $f = gq + r$, with $r = 0$ or $\deg(r) < \deg(g)$. What does a proof of this look like? Can we simply assume that a quotient and remainder always exist? I remember we proved the Division Algorithm for integers using the Well-Ordering Principle, but how can this be done for polynomials?

For the record, I don't mean this with regards to $\gcd (f,g)$.


BEST ANSWER

Basically, what you need is to be able to divide any coefficient by the leading coefficient of the divisor.

Explicitly, let $f(x) = a_nx^n+\cdots + a_1x + a_0$ be a polynomial with coefficients in some (commutative) ring, and assume that $a_n$ is a unit; that essentially means that you can “divide by $a_n$”. For example, for polynomials with integer coefficients, you would need the leading coefficient to be either $1$ or $-1$.

We prove that for every polynomial $g(x)$ (with coefficients in the same ring) there exist polynomials $q(x)$ and $r(x)$ such that $g(x) = q(x)f(x) + r(x)$, with $r(x)=0$ or $\deg(r)\lt \deg(f)$. The polynomial $r(x)$ is the “remainder”, and the polynomial $q(x)$ is the “quotient”. We prove this by induction on $\deg(g)$.

If $g=0$, then we can take $q(x)=r(x) = 0$. Assume the result holds for any polynomial of degree strictly smaller than $k$, and that $\deg(g)=k$. Write $$g(x) = b_kx^k+\cdots + b_1x + b_0.$$ If $k\lt n=\deg(f)$, we let $q(x)=0$ and $r(x)=g(x)$.

If $k\geq n$, then think long division. Look at $q_1(x)=\frac{b_k}{a_n}x^{k-n}$. This makes sense, because we said we can divide by $a_n$, and because $k-n\geq 0$. Note that the leading term of $q_1(x)f(x)$ is exactly $b_kx^k$. Therefore, $g(x) - q_1(x)f(x)$ is either $0$ or of degree strictly smaller than $k$. Either way, by induction we know that there exist a polynomial $q_2(x)$ and a remainder $r(x)$ with $r(x)=0$ or $\deg(r)\lt\deg(f)$, and $$g(x)-q_1(x)f(x) = q_2(x)f(x) + r(x).$$ But then we have $$g(x) = q_1(x)f(x)+q_2(x)f(x) + r(x) = (q_1(x)+q_2(x))f(x)+r(x),$$ with $r(x)=0$ or $\deg(r)\lt\deg(f)$. Letting $q(x)=q_1(x)+q_2(x)$ we are done.

Note that the Well-ordering principle for the nonnegative integers is essentially equivalent to induction.

Once you have this it is straightforward to go further and prove uniqueness, again under the assumption that you can always divide by $a_n$.

When the coefficients are in a field (like $\mathbb{Q}$, or $\mathbb{R}$, or $\mathbb{C}$) this always holds, which means you can always do division with remainder for one polynomial divided by a nonzero polynomial.
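Over a field, the inductive step above translates directly into an algorithm: repeatedly peel off the leading term with $q_1(x)=\frac{b_k}{a_n}x^{k-n}$ until the degree drops below $\deg(f)$. Here is a minimal sketch in Python using exact rational arithmetic; the list representation and the name `poly_divmod` are my own choices, not from the answer.

```python
from fractions import Fraction

def poly_divmod(g, f):
    """Divide g by f, returning (q, r) with g = q*f + r and either r == []
    (the zero polynomial) or deg(r) < deg(f).  Polynomials are lists of
    coefficients, lowest degree first: [a0, a1, a2] means a0 + a1*x + a2*x^2.
    The divisor f must be nonzero with no trailing zero coefficients."""
    if not f:
        raise ZeroDivisionError("division by the zero polynomial")
    q = [Fraction(0)] * max(len(g) - len(f) + 1, 1)
    r = [Fraction(c) for c in g]
    a_n = Fraction(f[-1])                 # leading coefficient of the divisor
    while len(r) >= len(f):
        # Peel off the leading term, as in long division:
        # q1 = (b_k / a_n) * x^(k - n).
        k, n = len(r) - 1, len(f) - 1
        c = r[-1] / a_n
        q[k - n] += c
        for i, fi in enumerate(f):
            r[k - n + i] -= c * Fraction(fi)
        while r and r[-1] == 0:           # discard trailing zero coefficients
            r.pop()
    return q, r
```

For example, dividing $x^2+3x+2$ by $x+1$ gives quotient $x+2$ and zero remainder. The loop terminates for the same reason the induction works: each pass cancels the leading term exactly, so the degree of the working remainder strictly decreases.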

ANSWER

The Euclidean algorithm can be proven to work in great generality. The key point is that the naturals are well ordered: at each step the degree of the remainder strictly decreases, so the process must terminate after finitely many steps, since eventually the remainder is constant or zero.

Generally, Euclid's algorithm works as long as you can associate a nonnegative integer, called a norm, to each nonzero element of your number system and perform some 'division' that strictly decreases this norm; a ring with such a norm is called a Euclidean domain.
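To illustrate how the degree plays the role of the norm, here is a sketch of Euclid's algorithm for polynomials over the rationals (the helper names `poly_rem` and `poly_gcd` are hypothetical, chosen for this example); the outer loop terminates precisely because each remainder has strictly smaller degree than the previous divisor.

```python
from fractions import Fraction

def poly_rem(g, f):
    """Remainder of g divided by nonzero f; polynomials are coefficient
    lists, lowest degree first.  Each pass cancels the leading term, so
    the degree of the working remainder strictly decreases."""
    r = [Fraction(c) for c in g]
    a_n = Fraction(f[-1])
    while len(r) >= len(f):
        c = r[-1] / a_n
        for i, fi in enumerate(f):
            r[len(r) - len(f) + i] -= c * Fraction(fi)
        while r and r[-1] == 0:
            r.pop()
    return r

def poly_gcd(a, b):
    """Euclid's algorithm; the degree (the 'norm' here) decreases at each
    step, so the loop must terminate.  Assumes a, b are not both zero."""
    a = [Fraction(c) for c in a]
    b = [Fraction(c) for c in b]
    while b:                              # [] represents the zero polynomial
        a, b = b, poly_rem(a, b)
    return [c / a[-1] for c in a]         # normalize to a monic gcd
```

For instance, the gcd of $(x+1)(x+2)$ and $(x+1)(x+3)$ comes out as the monic polynomial $x+1$.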