Division theorem for polynomials with integer coefficients


I can see that the Division Theorem holds for polynomials in $\mathbb{Q}[x]$, but does not necessarily hold for polynomials in $\mathbb{Z}[x]$, e.g.

Let $f=x^2+3x$ and $g=5x+2$.

Then the Division Theorem yields unique polynomials $q$ and $r$ with $r=0$ or $\deg(r)<\deg(g)$ such that

$f=gq+r$; concretely,

$x^2+3x=(\frac{1}{5}x+\frac{13}{25})\cdot(5x+2)-\frac{26}{25}$

where $q=\frac{1}{5}x+\frac{13}{25}$ and $r=-\frac{26}{25}$, but $q,r \notin \mathbb{Z}[x]$.
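The computation above can be reproduced by polynomial long division over $\mathbb{Q}$. A minimal sketch using Python's `fractions` module (the coefficient-list convention, highest degree first, is an assumption of this sketch, not part of the question):

```python
from fractions import Fraction

def poly_div(f, g):
    """Long division of polynomials over Q.

    f, g are coefficient lists, highest degree first.
    Returns (q, r) with f = g*q + r and deg r < deg g (or r = 0).
    """
    f = [Fraction(c) for c in f]
    g = [Fraction(c) for c in g]
    q = [Fraction(0)] * max(len(f) - len(g) + 1, 1)
    r = f[:]
    while len(r) >= len(g) and any(r):
        # Divide the leading terms; over Z this step can fail,
        # which is exactly why the theorem breaks down there.
        coef = r[0] / g[0]
        deg = len(r) - len(g)
        q[len(q) - 1 - deg] = coef
        for i, gc in enumerate(g):
            r[i] -= coef * gc
        r.pop(0)  # leading coefficient of r is now zero
    return q, r

# f = x^2 + 3x, g = 5x + 2
q, r = poly_div([1, 3, 0], [5, 2])
print(q)  # [Fraction(1, 5), Fraction(13, 25)]  i.e. q = (1/5)x + 13/25
print(r)  # [Fraction(-26, 25)]                 i.e. r = -26/25
```

The output matches the hand computation: the quotient and remainder exist and are unique over $\mathbb{Q}$, but their coefficients are not integers.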

The question is: how does one show that no $q,r \in \mathbb{Z}[x]$ can satisfy $f=gq+r$ (with the degree condition on $r$)?

More concretely: how does one exhibit particular $f$ and $g$ in $\mathbb{Z}[x]$ for which there do not exist $q,r$ in $\mathbb{Z}[x]$ such that $f=gq+r$ with $r=0$ or $\deg(r)<\deg(g)$?


Answer 1:

The Division Theorem does hold for $\mathbb Q[x]$, and there the pair $q, r$ is unique. So if the unique $q, r \in \mathbb Q[x]$ with $f = gq + r$ and $r = 0$ or $\deg r < \deg g$ do not lie in $\mathbb Z[x]$, there certainly cannot be any other pair in $\mathbb Z[x]$: such a pair would also satisfy the conditions over $\mathbb Q[x]$, contradicting uniqueness.

Answer 2:

If $\deg f \ge \deg g \ge 1$ and the leading coefficient of $g$ does not divide the leading coefficient of $f$ (in $\mathbb Z$), then there are no $q,r\in\mathbb Z[X]$ such that $f=gq+r$ with $r=0$ or $\deg r<\deg g$. Indeed, since $\deg r<\deg g\le\deg f$, the remainder cannot affect the top-degree term, so comparing leading coefficients in $f=gq+r$ forces $\operatorname{lc}(f)=\operatorname{lc}(g)\operatorname{lc}(q)$, which has no solution in $\mathbb Z$.
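For the example above ($f=x^2+3x$, $g=5x+2$) this obstruction can be checked directly by comparing coefficients. A small sketch, where the names `a`, `b`, `c` are placeholder unknowns for the integer coefficients of $q$ and $r$ (not part of the original argument):

```python
# Over Z, any candidate quotient has the form q = a*x + b, and any
# remainder with deg r < deg g = 1 is a constant r = c.
# Expanding g*q + r = (5x + 2)(a*x + b) + c gives
#   5a*x^2 + (2a + 5b)*x + (2b + c).
# Matching against f = x^2 + 3x, the x^2 coefficients require 5a = 1,
# which has no integer solution; a brute-force scan over a confirms it.
found = any(5 * a == 1 for a in range(-1000, 1001))
print(found)  # False: 5 does not divide 1, so no integer quotient exists
```

The lower-degree coefficients never even come into play: the top-degree comparison alone rules out every integer choice of $q$.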