Does this assertion really hold?
My math teacher said so.
I have not been able to find a counterexample so far, but I doubt the assertion.
Well, consider $f(x) = ax^2+bx+c$ with integer coefficients $a,b,c$ and $a \neq 0$. The zeros are $$x_{1,2} = \frac{-b \pm\sqrt{b^2-4ac}}{2a}.$$ If the discriminant $D=b^2-4ac$ is a perfect square, i.e., $D=d^2$ for some nonnegative integer $d$, then $$x_{1,2} = \frac{-b \pm d}{2a}.$$ The zeros are integers if and only if $2a$ divides both $-b+d$ and $-b-d$.
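If you want to experiment with small cases, here is a minimal sketch in Python of exactly this test (the helper name `integer_zeros` is mine, not from the thread): compute $D$, check that it is a nonnegative perfect square, then check the divisibility by $2a$.

```python
from math import isqrt

def integer_zeros(a: int, b: int, c: int):
    """Return the zeros of a*x^2 + b*x + c if both are integers,
    else None. Assumes integer a, b, c with a != 0.
    (Illustrative helper, not part of the original answer.)"""
    D = b * b - 4 * a * c
    if D < 0:
        return None  # no real zeros at all
    d = isqrt(D)
    if d * d != D:
        return None  # D is not a perfect square, zeros are irrational
    # Zeros are (-b ± d) / (2a); both numerators must be divisible by 2a.
    if (-b + d) % (2 * a) != 0 or (-b - d) % (2 * a) != 0:
        return None
    return ((-b + d) // (2 * a), (-b - d) // (2 * a))

# x^2 - 5x + 6 = (x - 2)(x - 3): both zeros are integers.
print(integer_zeros(1, -5, 6))   # (3, 2)
# 2x^2 - 3x + 1 = (2x - 1)(x - 1): D = 1 is a square, but 1/2 is not an integer.
print(integer_zeros(2, -3, 1))   # None
```

The second example shows why the divisibility check matters: a square discriminant alone only guarantees rational zeros, not integer ones.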
How about $x^{2}+1$? This satisfies the hypothesis, doesn't it? You cannot factor this even over $\mathbb{R}$.