Here is one of my homework problems from my computer algebra class:
Let $f(x),g(x)\in \mathbb{Z}[x]$ be of positive degree $m,n$, and $f(\alpha)=g(\beta)=0$, where $\alpha\neq \beta$ are real. Show that $$ |\alpha-\beta|>\frac{1}{2^{(n+1)(m+1)}||g||_1^{m}||f||_1^n}$$
Here is my solution:
Notice that $(x,y)=(\alpha-\beta,\beta)$ is a common zero of $f(x+y)$ and $g(y)$. Hence $\alpha-\beta$ is a root of $$ R(x):=\mathrm{Res}_y(f(x+y),g(y))=\det \big(\mathrm{Syl}_y(f(x+y),g(y))\big)$$
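To sanity-check the construction on a concrete pair (this is only an illustration, not part of the proof), here is a small sympy sketch with $f=x^2-2$, $g=x^2-3$, so we can take $\alpha=\sqrt 2$, $\beta=\sqrt 3$:

```python
# Compute R(x) = Res_y(f(x+y), g(y)) for f = x^2 - 2, g = x^2 - 3
# and check that alpha - beta = sqrt(2) - sqrt(3) is a root of R.
from sympy import symbols, resultant, expand, sqrt

x, y = symbols('x y')
f = x**2 - 2
g = x**2 - 3

R = expand(resultant(f.subs(x, x + y), g.subs(x, y), y))
print(R)  # x**4 - 10*x**2 + 1

# alpha - beta should be a (non-zero) root of R
print(expand(R.subs(x, sqrt(2) - sqrt(3))))  # 0
```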
Now we can view $\mathrm{Syl}_y(f(x+y),g(y))$ as a matrix in $(\mathbb{Z}[x])^{(m+n)\times(m+n)}$. By a Hadamard-type inequality for the $1$-norm on $\mathbb{Z}[x]$ (by the $1$-norm I mean $||\sum a_i x^i||_1:=\sum |a_i|$; the determinant's $1$-norm is bounded by the product over the rows of the sums of the entries' $1$-norms), we have: $$\begin{aligned} ||R||_1&\leqslant \big(||a_m||_1+||\tbinom{m}{1}a_mx+a_{m-1}||_1+||\tbinom{m}{2}a_mx^2+\tbinom{m-1}{1}a_{m-1}x+a_{m-2}||_1+\dots \big)^n\cdot ||g||_1^{m}\\ &= \big(|a_m|2^m+|a_{m-1}| 2^{m-1}+\dots+|a_0|\big)^n\cdot ||g||_1^{m}\\ &\leqslant 2^{mn}||f||_1^n||g||_1^m \end{aligned}$$
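The norm bound $||R||_1\leqslant 2^{mn}||f||_1^n||g||_1^m$ can also be checked numerically on a few random integer polynomials (again only a sanity check, under the convention $m=\deg f$, $n=\deg g$):

```python
# Spot-check ||R||_1 <= 2^{mn} ||f||_1^n ||g||_1^m on random monic
# integer polynomials, with R = Res_y(f(x+y), g(y)).
from sympy import symbols, resultant, Poly
import random

x, y = symbols('x y')

def one_norm(p):
    # ||sum a_i x^i||_1 = sum |a_i|
    return sum(abs(c) for c in Poly(p, x).all_coeffs())

random.seed(0)
for _ in range(5):
    m, n = random.randint(1, 3), random.randint(1, 3)
    f = x**m + sum(random.randint(-4, 4) * x**i for i in range(m))
    g = x**n + sum(random.randint(-4, 4) * x**i for i in range(n))
    R = resultant(f.subs(x, x + y), g.subs(x, y), y)
    assert one_norm(R) <= 2**(m * n) * one_norm(f)**n * one_norm(g)**m
print("bound holds on all samples")
```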
Since $\alpha-\beta$ is a non-zero root of $R(x)\in\mathbb{Z}[x]$ (divide out the power of $x$ and apply the Cauchy bound to the reversed polynomial; the trailing coefficient is a non-zero integer), we have: $$ \begin{aligned} |\alpha-\beta|&>\frac{1}{1+||R||_\infty}\geqslant \frac{1}{1+||R||_1}\geqslant \frac{1}{2^{mn}||f||_1^n||g||_1^m+1}\\ &\geqslant\frac{1}{2^{mn+1}||f||_1^n||g||_1^m}>\frac{1}{2^{(n+1)(m+1)}||g||_1^{m}||f||_1^n}, \end{aligned}$$ where the second-to-last step uses $||f||_1,||g||_1\geqslant 1$ (so $1\leqslant 2^{mn}||f||_1^n||g||_1^m$), and the last step is strict because $m+n\geqslant 2$ gives $mn+1<(m+1)(n+1)$.
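On the concrete pair $f=x^2-2$, $g=x^2-3$ (so $m=n=2$, $||f||_1=3$, $||g||_1=4$), the final bound can be checked numerically:

```python
# Check |alpha - beta| > 1 / (2^{(n+1)(m+1)} ||g||_1^m ||f||_1^n)
# for f = x^2 - 2, g = x^2 - 3, alpha = sqrt(2), beta = sqrt(3).
import math

m = n = 2
f1, g1 = 3, 4  # ||f||_1 and ||g||_1
alpha, beta = math.sqrt(2), math.sqrt(3)

bound = 1 / (2**((n + 1) * (m + 1)) * g1**m * f1**n)  # 1/73728
print(abs(alpha - beta), bound)  # ~0.3178 vs ~1.356e-05
assert abs(alpha - beta) > bound
```

As expected, the bound is far from tight here; it only needs to be a uniform lower bound in terms of the degrees and norms.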
I find my proof strange, since I use some redundant and very crude scaling in the last step. Am I missing or misusing something?
Any comments or corrections are highly appreciated. Thanks!
Edit: I have made some corrections; the proof now seems more reasonable.