Given that $a,b,c,d,e$ are real numbers with $2a^2<5b$, show that $x^5+ax^4+bx^3+cx^2+dx+e$ cannot have all real roots.



There are 2 answers below.

Answer 1 (score 9):

The claim as originally posted is false: $x^4-\tfrac14 x^3=0$, with $a=1$ and $b=-\tfrac14$ (so $2a^2=2$ and $5b=-1.25$), has all real roots.
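The example can be verified numerically; a minimal sketch with numpy (the coefficient values follow the answer above):

```python
import numpy as np

# Quick check of the proposed counterexample x^4 - (1/4) x^3
# (for the originally posted, non-monic version): here a = 1 and
# b = -1/4, and every root should be real.
coeffs = [1, -0.25, 0, 0, 0]       # x^4 - 0.25 x^3 = x^3 (x - 0.25)
roots = np.roots(coeffs)
a, b = 1.0, -0.25

print(np.sort(roots.real))         # roots 0, 0, 0, 0.25
print(np.allclose(roots.imag, 0))  # all roots are real
print(2 * a**2, 5 * b)             # 2.0 and -1.25
```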

Answer 2 (score 1):

I think this [the originally posted problem statement with $f(x)=ax^4+bx^3+cx^2+dx+e$] is an incorrect copy of a problem about a monic polynomial written as $x^n + ax^{n-1} + bx^{n-2} + \dots$ (here $n=5$). There is no inequality, or any other condition, on the two highest-degree coefficients that could limit the number of real roots, but there are such inequalities on $a$ and $b$ in the monic form.

The question for degree-$n$ monic polynomials is: what linear inequalities must hold between $a^2 = \left(\sum a_i\right)^2$ and $b = \sum_{i<j} a_i a_j$ for real numbers $a_1, a_2, \dots, a_n$? (If the $a_i$ are the roots, Vieta's formulas give $a = -\sum a_i$ and $b = \sum_{i<j} a_i a_j$.)

All roads, such as Cauchy-Schwarz or expanding $\sum_{i<j} (a_i - a_j)^2$, seem to lead to the same inequality, $(n-1) a^2 \geq 2n b$. I suspect that all other inequalities on $(a,b)$ can be deduced from that.
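The identity behind that inequality, $(n-1)a^2 - 2nb = \sum_{i<j}(a_i-a_j)^2 \geq 0$, can be checked symbolically; a sketch in sympy for $n=5$ (variable names are mine):

```python
import sympy as sp

# Symbolic check (for n = 5) of the identity behind (n-1) a^2 >= 2 n b:
#   (n - 1) a^2 - 2 n b = sum_{i<j} (a_i - a_j)^2,
# where a^2 = (sum a_i)^2 and b = sum_{i<j} a_i a_j.
n = 5
xs = sp.symbols('a1:6')  # a1, ..., a5
a_sq = sum(xs) ** 2
b = sum(xs[i] * xs[j] for i in range(n) for j in range(i + 1, n))
lhs = (n - 1) * a_sq - 2 * n * b
rhs = sum((xs[i] - xs[j]) ** 2 for i in range(n) for j in range(i + 1, n))
print(sp.expand(lhs - rhs) == 0)  # True: the identity holds
```

Since the right-hand side is a sum of squares, the inequality follows immediately.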

For $n=4$, a necessary condition for all roots to be real is $3a^2 \geq 8b$.

For $n=5$, a necessary condition for all roots to be real is $2a^2 \geq 5b$.
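As a numeric sanity check of the $n=5$ case, one can build monic quintics from random real roots and confirm $2a^2 \geq 5b$ every time (a sketch with numpy; names are mine):

```python
import numpy as np

# Numeric sanity check of the n = 5 case: monic quintics with
# randomly chosen real roots should always satisfy 2 a^2 >= 5 b,
# where a and b are the x^4 and x^3 coefficients.
rng = np.random.default_rng(0)
for _ in range(1000):
    roots = rng.normal(scale=3.0, size=5)
    coeffs = np.poly(roots)            # monic: [1, a, b, c, d, e]
    a, b = coeffs[1], coeffs[2]
    assert 2 * a**2 >= 5 * b - 1e-9    # tiny slack for rounding
print("2 a^2 >= 5 b held for all 1000 samples")
```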

Therefore, I guess that your problem was actually to

show that $x^5 + ax^4 + bx^3 + cx^2 + dx + e$ cannot have all real roots if $2a^2 < 5b$.

Knowing what the inequality should be for degree-$n$ monic polynomials, one can also prove by induction on $n$, differentiating the polynomial and using Rolle's theorem, that when the inequality fails the polynomial has at most $n-2$ real roots.
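For $n=5$ the differentiation/Rolle argument can be made completely explicit; a sketch:

```latex
% Sketch for n = 5 (assuming the monic form of the problem).
Let $f(x) = x^5 + ax^4 + bx^3 + cx^2 + dx + e$. Then
\[
  f'''(x) = 60x^2 + 24ax + 6b,
\]
a quadratic with discriminant
\[
  (24a)^2 - 4 \cdot 60 \cdot 6b = 288\,(2a^2 - 5b).
\]
If $2a^2 < 5b$ this is negative, so $f'''$ has no real roots. Counting
roots with multiplicity, Rolle's theorem gives that $f^{(k)}$ has at
least $m - k$ real roots whenever $f$ has $m$; applying this with
$k = 3$ shows $f$ has at most $3 = n - 2$ real roots.
```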