Prove that for a monic polynomial $f(x)=x^n+a_{n-1}x^{n-1}+\dots+a_1x+a_0$ with integer coefficients $a_0,a_1,\dots,a_{n-1}$, the GCD of the values $f(x)$ over all integers $x$ is at most $n!$. For example, $x^2+x$ attains the bound for $n=2$: it factors as $x(x+1)$, a product of two consecutive integers, one of which is always divisible by $2$.
The claim seems almost obvious, but I can't find a way to prove it. My main difficulty is showing that the GCD cannot exceed $n!$.
I know equality occurs for polynomials such as
$$x(x+1)(x+2)\cdots(x+n-1),$$
which is always divisible by $n!$, but I can't prove that $n!$ is the greatest possible value.
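As a quick numerical sanity check of the equality case (the helper names `rising_factorial` and `value_gcd` are just illustrative, not from the question), one can compute the GCD of the values of $x(x+1)\cdots(x+n-1)$ over a sample of integers and compare it with $n!$:

```python
from math import gcd, factorial

def rising_factorial(x, n):
    """x(x+1)(x+2)...(x+n-1): the product of n consecutive integers."""
    p = 1
    for k in range(n):
        p *= x + k
    return p

def value_gcd(f, xs):
    """GCD of the values f(x) over the sample points xs."""
    g = 0
    for x in xs:
        g = gcd(g, f(x))  # gcd(0, m) = |m|, so starting from 0 is safe
    return g

for n in range(1, 7):
    g = value_gcd(lambda x: rising_factorial(x, n), range(-10, 11))
    print(n, g, factorial(n))  # the sampled GCD equals n! for each n
```

Of course this only checks finitely many $x$; the GCD over a sample can only be larger than the GCD over all integers, so the fact that it already equals $n!$ here shows the true GCD is exactly $n!$ for these small cases.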
Is there a number-theoretic proof of this?
The proof is by induction on $n$: for any polynomial $f(x)=\sum_{i=0}^n a_ix^i$ with integer coefficients and $a_n\neq 0$, the GCD of the values $f(x)$, $x\in\mathbb{Z}$, is at most $|a_n|\,n!$.

If $n=1$, the numbers $a_1x+a_0$ and $a_1(x+1)+a_0$ differ by $a_1$, so their GCD is at most $|a_1|$, as claimed.

Suppose the claim holds for $n$, and let $f(x)=a_{n+1}x^{n+1}+\dots+a_0$ be a polynomial with integer coefficients and $a_{n+1}\neq 0$. Let $d$ be the GCD of all values $f(x)$, $x\in\mathbb{Z}$; we show that $d\leq |a_{n+1}|(n+1)!$. Since $d\mid f(x)$ and $d\mid f(x+1)$ for every $x$, we have $d\mid\bigl(f(x+1)-f(x)\bigr)$ for all $x$, and so $d\leq \gcd_{x\in\mathbb{Z}} g(x)$, where $g(x)=f(x+1)-f(x)$. Now
$$g(x)=a_{n+1}\bigl((x+1)^{n+1}-x^{n+1}\bigr)+a_n\bigl((x+1)^n-x^n\bigr)+\dots+a_1=a_{n+1}(n+1)x^n+q(x),$$
where $q(x)$ has degree less than $n$. Thus $g$ is a degree-$n$ polynomial with integer coefficients and leading coefficient $a_{n+1}(n+1)$, so by the inductive hypothesis $d \leq \gcd_{x\in\mathbb{Z}} g(x)\leq |a_{n+1}(n+1)|\,n!=|a_{n+1}|(n+1)!$.
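The finite-difference step can be seen concretely on one example (the polynomial $f(x)=2x^3+3x^2+x$ below is an arbitrary choice for illustration, not taken from the proof):

```python
from math import gcd

def f(x):
    # Arbitrary example: f(x) = 2x^3 + 3x^2 + x = x(x+1)(2x+1),
    # degree n+1 = 3 with leading coefficient a_3 = 2.
    return 2 * x**3 + 3 * x**2 + x

def g(x):
    # Forward difference f(x+1) - f(x): degree 2, leading coefficient 2*3 = 6.
    return f(x + 1) - f(x)

xs = range(-10, 11)

# d = GCD of the sampled values of f.
d = 0
for x in xs:
    d = gcd(d, f(x))

# Every common divisor of the values of f also divides each difference g(x).
assert all(g(x) % d == 0 for x in xs)

# The bound from the proof: d <= |a_3| * 3! = 2 * 6 = 12.
assert d <= 2 * 6
print(d)  # -> 6, comfortably within the bound of 12
```

Here $f(x)=x(x+1)(2x+1)$ is always divisible by $6$ but not by $12$ (e.g. $f(1)=6$), so $d=6$, strictly below the bound $|a_3|\cdot 3!=12$; the inequality in the proof need not be tight.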