linear combination of two coprime polynomials such that the roots are distinct?


Let $a = (a_0, \dots, a_{n-1})^T \in \mathbb R^n$ and $b = (b_0, \dots, b_{n-1})^T \in \mathbb R^n$. We define two polynomials by \begin{align*} f_a(x) &= x^n + a_{n-1} x^{n-1} + \cdots + a_0 \\ f_b(x) &= b_{n-1} x^{n-1} + \cdots +b_0. \end{align*}

Suppose we require that $f_a$ and $f_b$ are coprime in $\mathbb C[x]$. Can we find a scalar $s \in \mathbb R$ such that $f_a(x) + s f_b(x)$ has distinct roots in $\mathbb C$?

If $f_a(x) + s f_b(x) = x^n + (a_{n-1} + s b_{n-1})x^{n-1} + \cdots + (a_0 + sb_0)$ had a multiple root for every $s \in \mathbb R$, then its discriminant, which is a polynomial in $s$, would vanish identically on $\mathbb R$. I don't think this can happen, but I could not figure out how to argue this part.
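As a sanity check on this discriminant argument, here is a small example of my own (not from the question), computed symbolically with SymPy: the discriminant of $f_a + s f_b$ in $x$ is a nonzero polynomial in $s$, so only finitely many $s$ give a repeated root.

```python
import sympy as sp

x, s = sp.symbols('x s')

# Small illustrative case n = 2 (my choice): f_a = x^2 + 1, f_b = x.
f_a = x**2 + 1
f_b = x
assert sp.gcd(f_a, f_b) == 1            # coprime over C

# Discriminant of f_a + s*f_b with respect to x, a polynomial in s.
disc = sp.discriminant(f_a + s * f_b, x)
assert disc == s**2 - 4                 # not identically zero

# Repeated roots occur only at the roots of disc (here s = +-2);
# every other real s gives distinct roots in C.
```

Any other coprime pair can be substituted for `f_a`, `f_b` to see the same phenomenon.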


(Edit: struck out to avoid confusion; this should be a separate question.) The coprime condition just conveniently comes out of my situation, and I am not sure it is necessary. Intuitively, I would think $b \neq 0$ should be enough.


There are 2 answers below.

Accepted answer:

Suppose that $f(x)+sg(x)$ has a repeated root for every $s$. Then for each $s$ there exists $x_s$ such that $$f(x_s)+sg(x_s)=f'(x_s)+sg'(x_s)=0.$$ Therefore, $f(x_s)g'(x_s)-f'(x_s)g(x_s)=0$ for all $s$. Let $h(x)=f(x)g'(x)-f'(x)g(x)$ and let $A$ be the finite set of roots of $h(x)$. $A$ is finite because otherwise $h(x)=0$, which would imply $f(x)g'(x)=f'(x)g(x)$. Since $f(x)$ and $g(x)$ are coprime, this in turn implies $f(x)\mid f'(x)$, which is impossible: $f'$ has degree $n-1 < n$, and $f' \neq 0$ since $f$ is monic of degree $n \geq 1$.

It follows that $x_s \in A$ for all $s$ and so there exists $x_0$ such that $x_s=x_0$ for infinitely many $s$. One has $f(x_0)+sg(x_0)=0$ for infinitely many values of $s$. This clearly happens only if $f(x_0)=g(x_0)=0$ contradicting the assumption that $f(x),g(x)$ are coprime.
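The two key facts in this argument, that $h = fg' - f'g$ is not identically zero for coprime $f, g$ and that away from finitely many values of $s$ the polynomial $f + sg$ is squarefree, can be checked symbolically for a concrete pair (my own illustrative choice of $f$ and $g$, not from the answer):

```python
import sympy as sp

x, s = sp.symbols('x s')

# Illustrative coprime pair (my choice): f monic of degree 3, g of lower degree.
f = x**3 + x + 1
g = x + 2
assert sp.gcd(f, g) == 1

# h = f g' - f' g cannot vanish identically when f, g are coprime.
h = sp.expand(f * sp.diff(g, x) - sp.diff(f, x) * g)
assert h != 0

# For sample values of s, f + s*g is squarefree: its gcd with its own
# derivative is 1, i.e. all complex roots are distinct.
for s0 in (0, 1, 5):
    p = sp.expand(f + s0 * g)
    assert sp.gcd(p, sp.diff(p, x)) == 1
```

A repeated root at some $s$ would show up as a nontrivial `gcd(p, p')`, and the argument above confines such $s$ to a finite set.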

Second answer:

Even without coprimality this seems to be true, provided one first cancels any common factor of $f$ and $g$ (a repeated common factor would of course force a repeated root of $f+sg$ for every $s$). Indeed, consider the rational function $$ r=\frac fg\colon\Bbb P^1(\Bbb C)\to\Bbb P^1(\Bbb C)\;. $$ The equation $$ r(z)=a $$ has only simple solutions if and only if $a$ is NOT a branch value, that is, whenever $z_0$ satisfies $r(z_0)=a$ we have $r'(z_0)\neq0$ (i.e. $z_0$ is not a branch point).

Now $r'=0$ has only a finite number of solutions $S=\{z_1,\dots,z_m\}\subset\Bbb P^1(\Bbb C)$ and the branch values of $r$ are thus $r(S)$ which is clearly a finite set. If $a\notin r(S)$ then $r(z)=a$ has only simple solutions.
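The branch-value argument can be illustrated concretely (again with a hypothetical pair of my choosing, $f = x^3+x+1$ and $g = x+2$): the critical points $S$ are the zeros of the numerator of $r'$, a finite set, and for $a \notin r(S)$ the equation $r(z)=a$, equivalently $f - ag = 0$, has only simple solutions.

```python
import sympy as sp

x = sp.symbols('x')

# Illustrative pair (my choice): r = f/g.
f = x**3 + x + 1
g = x + 2
r = f / g

# Critical points S: zeros of the numerator of r' (finitely many).
# The numerator is f'*g - f*g', a cubic for this pair.
num = sp.numer(sp.together(sp.diff(r, x)))
S = sp.solve(sp.Eq(num, 0), x)
assert len(S) == 3

# r(z) = a has only simple solutions iff f - a*g is squarefree.
# For this pair a = 0 is not a branch value, and indeed f - 0*g = f
# shares no root with its derivative:
p = f
assert sp.gcd(p, sp.diff(p, x)) == 1
```

(This finite-$S$ computation works over the affine chart; on $\Bbb P^1$ one would also account for the point at infinity, which does not change the finiteness of $r(S)$.)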