How can one prove that if the discriminant of a quadratic function is strictly positive, then the quadratic function has two distinct roots?


It is a well-known fact that given $f(x)=ax^2+bx+c,\ a \neq 0$ and $\Delta=b^2-4ac>0$, the function $f(x)$ has two distinct roots. I assume we work over $\mathbb{R}$ at every stage of the problem.

My Attempt

To prove this I considered that $f(x)=a \left(x+\frac{b}{2a}-\frac{\sqrt{\Delta}}{2a}\right) \left(x+\frac{b}{2a}+\frac{\sqrt{\Delta}}{2a}\right)$ upon applying Gaussian completion of squares.

Then I argued: suppose the roots are not distinct; then $-\sqrt{\Delta}=\sqrt{\Delta}$, which forces $\Delta=0$. So we are left with the case $\Delta \neq 0$.

From here, I don't know how to get to the next stage.
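For what it's worth, the factorization in the attempt can be sanity-checked numerically. This is a minimal sketch with illustrative coefficients (not from the question), comparing $f(x)$ against $a\left(x+\frac{b}{2a}-\frac{\sqrt{\Delta}}{2a}\right)\left(x+\frac{b}{2a}+\frac{\sqrt{\Delta}}{2a}\right)$:

```python
import math

# Illustrative coefficients with Delta = b^2 - 4ac > 0 (an assumption, not from the question).
a, b, c = 2.0, 3.0, -5.0
delta = b**2 - 4*a*c
assert delta > 0

s = math.sqrt(delta) / (2*a)

def f(x):
    return a*x**2 + b*x + c

def factored(x):
    # a * (x + b/(2a) - sqrt(Delta)/(2a)) * (x + b/(2a) + sqrt(Delta)/(2a))
    return a * (x + b/(2*a) - s) * (x + b/(2*a) + s)

# The two expressions agree (up to floating-point error) at arbitrary sample points.
for x in (-2.0, 0.0, 1.5, 10.0):
    assert abs(f(x) - factored(x)) < 1e-9
```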


There are 4 best solutions below

On BEST ANSWER

Actually, from what I gather, you are correct and almost done with your proof. You began with $\Delta>0$ and assumed the roots were not distinct, which brought you to $\Delta=0$. Is this consistent with $\Delta>0$? If not, what does that imply about your assumption?

The conclusion $\Delta=0$ directly contradicts the hypothesis $\Delta>0$, and it stems from the assumption that the roots of the quadratic function are equal. Therefore the roots must be distinct, as you intended to prove.

On

Since $a \neq 0$, divide your equation by $a$; renaming the coefficients, you get an equation of the form $$x^2+bx+c=0$$ Now:
$x^2+bx+c = x^2+bx+\left(\frac{b}{2}\right)^2 - \left(\frac{b}{2}\right)^2 + c = 0 \implies \left(x+\frac{b}{2}\right)^2 = \left(\frac{b}{2}\right)^2 - c$

In order for this to have real solutions, we require $$\left(\frac{b}{2}\right)^2 - c \geq 0$$ Note that for this monic equation $\left(\frac{b}{2}\right)^2 - c = \frac{\Delta}{4}$, so it has the same sign as the discriminant.

Since you found that the roots fail to be distinct only when $\Delta = 0$, and we know real roots exist when $\Delta > 0$, the roots must be distinct in that case.
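The completing-the-square step above can be illustrated numerically. A minimal sketch with an assumed monic example (the coefficients are illustrative, not from the question):

```python
import math

# Monic quadratic x^2 + bx + c, as obtained after dividing by a.
b, c = -1.0, -6.0            # illustrative: x^2 - x - 6 = (x - 3)(x + 2)
d = (b/2)**2 - c             # the quantity (b/2)^2 - c from the answer
assert d > 0                 # positive, so two real solutions exist

# Roots read off from (x + b/2)^2 = d
r1 = -b/2 + math.sqrt(d)
r2 = -b/2 - math.sqrt(d)

assert r1 != r2              # sqrt(d) > 0 forces the roots apart
assert abs(r1**2 + b*r1 + c) < 1e-9
assert abs(r2**2 + b*r2 + c) < 1e-9
```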

On

Starting from

$$ax^2+bx+c=0,$$ if $a\ne 0$ we have $$ax^2+bx=-c \iff x^2+\frac bax\color{red}{+\frac{b^2}{4a^2}}=\color{red}{\frac{b^2}{4a^2}}-\frac ca$$ $$\left(x+\frac{b}{2a}\right)^2=\frac{b^2-4ac}{4a^2}\equiv \frac{\Delta}{4a^2} \tag 1$$ If $\Delta <0$, there are no real solutions, since the left-hand side is non-negative while the right-hand side is negative ($4a^2>0$); if $\Delta=0$, we have $$\left(x+\frac{b}{2a}\right)^2=0\implies x=-\frac{b}{2a},$$ a single solution of multiplicity $2$. If $\Delta >0$, eq. $(1)$ becomes

$$\left(x+\frac{b}{2a}\right)^2- \left(\frac{\sqrt{\Delta}}{2a}\right)^2 =0,$$ i.e.,

$$x+\frac{b}{2a}-\frac{\sqrt{\Delta}}{2a}=0 \quad \vee \quad x+\frac{b}{2a}+\frac{\sqrt{\Delta}}{2a}=0,$$ i.e. $x_1=\frac{-b+\sqrt{\Delta}}{2a}$ and $x_2=\frac{-b-\sqrt{\Delta}}{2a}$, which are the two distinct solutions of the degree-$2$ equation, since $\sqrt{\Delta}>0$.
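The difference-of-squares factorization in this answer can be checked with a quick sketch; the coefficients below are illustrative assumptions, not from the thread:

```python
import math

a, b, c = 1.0, -5.0, 6.0     # illustrative: x^2 - 5x + 6, Delta = 1 > 0
delta = b**2 - 4*a*c
assert delta > 0

s = math.sqrt(delta) / (2*a)

# The two factors (x + b/2a - s) and (x + b/2a + s) vanish at different points.
x1 = -b/(2*a) + s            # zero of the factor with the minus sign
x2 = -b/(2*a) - s            # zero of the factor with the plus sign

assert x1 != x2              # distinct, because s > 0 when Delta > 0
for x in (x1, x2):
    assert abs(a*x**2 + b*x + c) < 1e-9
```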

On

Let $$Ax^2+Bx+C=0, \quad A \neq 0,$$ have roots $r_1$ and $r_2$ given by the quadratic formula:

$$r_1=\frac{-B+\sqrt{B^2-4AC}}{2A}$$

$$r_2=\frac{-B-\sqrt{B^2-4AC}}{2A}$$

We can use the quadratic formula here, since the question did not rule it out.

The discriminant $B^2-4AC$ falls into one of the following cases:

  1. $<0$: the roots are not real; this case is outside the scope of the question.

  2. $=0$: the roots coincide at the single value $\frac{-B}{2A}$, since $\sqrt{B^2-4AC}=0$.

  3. $>0$: $\sqrt{B^2-4AC}>0$, so $r_1 \neq r_2$ and the roots are distinct real values.
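The three cases above can be sketched in code. This is a minimal illustration (the sample coefficients are assumptions chosen to hit each case), using `cmath.sqrt` so the $\Delta<0$ case also produces values:

```python
import cmath

def roots(A, B, C):
    """Roots of A x^2 + B x + C = 0 via the quadratic formula (assumes A != 0)."""
    d = B**2 - 4*A*C
    s = cmath.sqrt(d)        # complex sqrt, so d < 0 is handled too
    return (-B + s) / (2*A), (-B - s) / (2*A)

r1, r2 = roots(1, 0, 1)      # d = -4 < 0: complex conjugate pair, not real
assert r1.imag != 0

r1, r2 = roots(1, -2, 1)     # d = 0: repeated root -B/(2A) = 1
assert r1 == r2 == 1

r1, r2 = roots(1, -3, 2)     # d = 1 > 0: two distinct real roots
assert r1 != r2 and r1.imag == 0 and r2.imag == 0
```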