Let $f(t)\in 1+t\mathbb{Z}[t]$ be an irreducible polynomial (not necessarily monic) of degree $\ge 2$. Since $f(0)=1$, $f$ is primitive, so by Gauss's lemma $f(t)$ is also irreducible in $\mathbb{Q}[t]$.
Is it possible that there exist arbitrarily large integers $n\ge 0$ satisfying:
"There exist an $a\in\mathbb{Z}$ (possibly 0) such that $at^n + f(t)$ splits completely into linear factors in $\mathbb{Z}[t]$" ?
A second, probably easier, question: can we find a polynomial $f(t) \in 1+t\mathbb{Z}[t]$ for which there do not exist arbitrarily large integers $n\ge 0$ satisfying the quoted condition?
Equivalently, apply the substitution $g(t)\leftrightarrow g(1/t)t^{\deg g}$, which reverses the coefficient order and, as long as the constant term is non-zero, preserves complete splitting into linear factors. Since $f(0)=1$, the reversed polynomial $\tilde f(t)=t^{\deg f}f(1/t)$ is monic of degree $d=\deg f$, and for $n>d$ and $a\ne 0$ the reversal of $at^n+f(t)$ is $t^{\,n-d}\tilde f(t)+a$. Renaming $\tilde f$ to $f$, the question becomes: do there exist arbitrarily large $n$ and $a_n\in\mathbb{Z}$ such that $g_n(t):=t^nf(t)+a_n$ splits completely into linear factors in $\mathbb{Z}[t]$?
Answer: No, there can only be finitely many such $n$.
Proof. Clearly $a_n$ must be non-zero, since otherwise $f$ itself would split into linear factors, contradicting its irreducibility.
As $\deg f>1$, both $f$ and its derivative $f'$ have only finitely many real zeroes. Let $$R=\max\bigl(\{1\}\cup\{\,|x| : x\in\mathbb{R},\ f(x)=0\ \text{or}\ f'(x)=0\,\}\bigr).$$ Then on each of the intervals $[R,\infty)$ and $(-\infty,-R]$, both $f$ and $f'$ have constant sign and $f$ is monotonic. Therefore $t^nf(t)$, and hence also $g_n(t)=t^nf(t)+a_n$, is monotonic on each of these two intervals.
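The monotonicity claim can be checked directly from the factorization of the derivative:

```latex
\frac{d}{dt}\bigl(t^{n}f(t)\bigr)
  \;=\; n t^{\,n-1}f(t) + t^{n}f'(t)
  \;=\; t^{\,n-1}\bigl(nf(t)+tf'(t)\bigr).
```

On $[R,\infty)$, both $f$ and $f'$ keep the signs they have at $+\infty$ (both that of the leading coefficient of $f$), so $nf(t)+tf'(t)$ has constant sign there. On $(-\infty,-R]$, $f(t)$ behaves like $\mathrm{lc}(f)\,t^{d}$ and $tf'(t)$ like $d\,\mathrm{lc}(f)\,t^{d}$, so the two terms again share a constant sign. In either case the derivative does not change sign on the interval.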
If $g_n$ splits completely into linear factors, it must have $n+d$ integer roots counted with multiplicity, where $d=\deg f$. Any root of multiplicity $k>1$ is a root of the derivative $g_n'(t)=t^{n-1}\bigl(tf'(t)+nf(t)\bigr)$ with multiplicity $k-1$. Since $g_n(0)=a_n\ne 0$, every repeated root of $g_n$ is a non-zero root of the degree-$d$ polynomial $tf'(t)+nf(t)$, so the total excess multiplicity is at most $d$, and $g_n$ has at least $n$ distinct integer roots. By monotonicity, $g_n$ has at most one root in $[R,\infty)$ and at most one root in $(-\infty,-R]$. Thus $g_n$ must have at least $n-2$ distinct integer roots in the interval $[-R,R]$, which is absurd once $n>2R+3$.
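As a numerical sanity check of the counting argument, here is a sketch for the illustrative monic choice $f(t)=t^2+t+1$ (my example, not from the proof): $f$ has no real zeroes and $f'$ vanishes only at $-1/2$, so one may take $R=1$, and the argument then predicts at most two distinct integer roots of $g_n(t)=t^nf(t)+a_n$ for any $n\ge 1$ and $a_n\ne 0$.

```python
def distinct_integer_roots(n, a):
    """Distinct integer roots of g_n(t) = t^n (t^2 + t + 1) + a, for a != 0.
    A non-zero integer root r satisfies r^n (r^2 + r + 1) = -a, hence r | a
    and |r| <= |a|, so a finite search suffices."""
    roots = set()
    for r in range(1, abs(a) + 1):
        if a % r != 0:
            continue
        for s in (r, -r):
            if s ** n * (s * s + s + 1) + a == 0:
                roots.add(s)
    return roots

# The proof bounds the number of distinct integer roots by 2 here,
# independently of n and a.
assert all(
    len(distinct_integer_roots(n, a)) <= 2
    for n in range(1, 7)
    for a in range(-60, 61) if a != 0
)
print("bound verified for n <= 6, 0 < |a| <= 60")
```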