If $x^2-bx+c=0$ has real roots, prove that both roots are greater than $1$ when $c+1>b>2$.



Working

I tried working backwards by assuming both roots are greater than $1$ and seeing what that implies.

Let $\alpha$, $\beta$ be the roots of the quadratic equation. So $$\alpha+\beta =b$$ $$\alpha\cdot\beta =c$$

Since $\alpha>1$ and $\beta>1$, it can be deduced that $$\alpha+\beta >2\implies b>2$$ $$\alpha\cdot\beta >1\implies c>1\implies c+1>2.$$ To combine these two inequalities I need another link between $b$ and $c$. How should I proceed? Thanks.
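The claim itself can be sanity-checked numerically before attempting a proof: sample pairs $(b,c)$ with $c+1>b>2$ and $b^2\ge 4c$, and confirm both roots exceed $1$ (a throwaway Python sketch, not part of any proof):

```python
import math
import random

random.seed(0)
checked = 0
for _ in range(10_000):
    b = random.uniform(2.0, 10.0)           # enforce b > 2
    c = random.uniform(b - 1.0, b * b / 4)  # aim for c + 1 > b and b^2 >= 4c
    if c + 1 <= b or b * b < 4 * c:
        continue                            # skip boundary samples
    d = math.sqrt(b * b - 4 * c)
    r1, r2 = (b - d) / 2, (b + d) / 2       # the two real roots
    assert r1 > 1 and r2 > 1, (b, c, r1, r2)
    checked += 1
print(checked)                              # number of cases verified
```

Every sampled case passes the assertion, which is consistent with the statement to be proved.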

There are 5 answers below.

Accepted answer

Given $x^2-bx+c=0$ we need

  • $\Delta =b^2-4c\ge0$

and

  • $x_1=\frac{b-\sqrt{b^2-4c}}{2}>1 \iff b-2> \sqrt{b^2-4c} \stackrel{b>2}\iff b^2-4b+4> b^2-4c \iff 4c+4>4b \iff c+1> b$
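The equivalence chain can be illustrated on a concrete instance; the values $b=3$, $c=2.2$ below are an arbitrary choice satisfying $c+1>b>2$ and $b^2\ge 4c$:

```python
import math

b, c = 3.0, 2.2                    # arbitrary pair with c + 1 > b > 2 and b^2 >= 4c
disc = b * b - 4 * c               # positive, so the roots are real
x1 = (b - math.sqrt(disc)) / 2     # smaller root
x2 = (b + math.sqrt(disc)) / 2     # larger root
# every step of the chain holds simultaneously for this instance:
assert (x1 > 1) == (b - 2 > math.sqrt(disc)) == ((b - 2)**2 > disc) == (c + 1 > b)
print(round(x1, 3), round(x2, 3))  # both roots exceed 1
```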
Answer 2

If $x^2-bx+c$ has both roots larger than $1$, then evaluating at $x=1$ must give the same sign as at very large negative $x$, i.e. positive. Therefore we need $1-b+c>0$, which rearranges to $c+1>b$.

On the other hand, if $1-b+c>0$ and the roots are real, then either both roots are smaller than $1$ or both are larger than $1$. If both were smaller than $1$, their sum would be smaller than $2$, i.e. $b<2$, contradicting $b>2$; so both are larger than $1$, and their sum is indeed larger than $2$.

Answer 3

The roots of $x^2-bx+c=0$ are $x =\dfrac{b\pm\sqrt{b^2-4c}}{2} $.

If $c+1 > b > 2$ and the roots are real (so $b^2 \ge 4c$), the smaller root is $\dfrac{b-\sqrt{b^2-4c}}{2}$, so it suffices to show $b-\sqrt{b^2-4c} \gt 2$, i.e. $(b-2)^2 \gt b^2-4c$ (squaring is valid since $b>2$), i.e. $b^2-4b+4 \gt b^2-4c$, i.e. $c+1 > b$, which we are given.

Answer 4

There is no need to calculate the roots.

Set $\;p(x)=x^2-bx+c$. If $p(x)$ has real roots and $c+1>b>2$, then $p(1)=1-b+c>0$, hence $1$ does not lie between the roots, i.e. both roots are smaller than $1$ or both are greater than $1$.

Now both roots are smaller (respectively greater) than $1$ exactly when their arithmetic mean is, and this mean is $\frac b2>1$ by hypothesis. The conclusion follows.

Answer 5

$$\begin{array}{rccccc} & c + 1 & > & b & > & 2 \\ \implies & \alpha\beta+1 & > & \alpha+\beta & > & 2 \\ \implies & \alpha\beta-\alpha-\beta+1 & > & 0 & > & 2-\alpha-\beta \\ \implies & (\alpha-1)(\beta-1) & \underbrace{>}_{(\star)} & 0 & \underbrace{>}_{(\star\star)} & -\left(\;(\alpha-1)+(\beta-1)\;\right) \\ \end{array}$$ Now, $(\star)$ implies that $\alpha-1$ and $\beta-1$ have the same sign, while $(\star\star)$ implies that that sign must be positive. Thus, $\alpha> 1$ and $\beta > 1$. $\square$
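The factorization the array relies on, $\alpha\beta-\alpha-\beta+1=(\alpha-1)(\beta-1)$, is elementary, but it can be spot-checked mechanically (a throwaway Python check over arbitrary real inputs, unrelated to any particular quadratic):

```python
import random

random.seed(1)
# verify a*b - a - b + 1 == (a - 1)*(b - 1) on random inputs,
# with a small tolerance for floating-point round-off
for _ in range(1000):
    a = random.uniform(-10.0, 10.0)
    b = random.uniform(-10.0, 10.0)
    assert abs((a * b - a - b + 1) - (a - 1) * (b - 1)) < 1e-9
print("identity verified on 1000 random pairs")
```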