Calculus approach to solve this Quadratic equation problem


Both roots of the equation $$(x-b)(x-c) + (x-a)(x-c) + (x-a)(x-b) = 0$$ are always positive, negative, or real. Prove your result.

By expanding this equation I got $3x^2 - 2(a+b+c)x + ab + bc + ca = 0$.
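As a sanity check (not part of the original question), the expansion can be verified numerically by comparing both forms at random sample points:

```python
import random

def lhs(x, a, b, c):
    # Original form: (x-b)(x-c) + (x-a)(x-c) + (x-a)(x-b)
    return (x-b)*(x-c) + (x-a)*(x-c) + (x-a)*(x-b)

def expanded(x, a, b, c):
    # Expanded form: 3x^2 - 2(a+b+c)x + (ab+bc+ca)
    return 3*x**2 - 2*(a+b+c)*x + (a*b + b*c + c*a)

random.seed(0)
for _ in range(1000):
    x, a, b, c = (random.uniform(-10, 10) for _ in range(4))
    assert abs(lhs(x, a, b, c) - expanded(x, a, b, c)) < 1e-6
print("expansion verified")
```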

Now by inspection of this quadratic: the leading coefficient is $3 > 0$ and $D = 4(a^2 + b^2 + c^2 - ab - bc - ca)$.

I'm stuck here and unable to conclude whether $D>0$, $D<0$, or $D=0$.

Also, is there an alternative approach to these types of questions? Kindly suggest any other method (if any) too.

From the various answers here, I now see where I was stuck. But I would still like to know an alternate approach to this question that avoids this much algebra.


On BEST ANSWER

Edit: I have divided this answer into two parts. The first originated as a quick answer to the question; I prefer the second because it provides an intuitive understanding of the situation.

Part 1 :

With your notations ($D$ being the discriminant):

$$\dfrac{D}{4} = a^2 + b^2 + c^2 -ab - bc - ca=\dfrac{1}{4}(2a-b-c)^2+\dfrac{3}{4}(b-c)^2$$

Thus, in general $D\geq 0$ with:

  • $D=0$ if and only if $b=c$ and $2a-b-c=0$ simultaneously, i.e. $a=b=c$ (the case of a double root).

  • $D>0$ otherwise (two distinct real roots).
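The sum-of-squares identity above, and the resulting sign of $D$, can be checked numerically (a sketch with random sample values, not part of the original answer):

```python
import random

def D_over_4(a, b, c):
    # D/4 = a^2 + b^2 + c^2 - ab - bc - ca
    return a*a + b*b + c*c - a*b - b*c - c*a

random.seed(1)
for _ in range(1000):
    a, b, c = (random.uniform(-10, 10) for _ in range(3))
    # Sum-of-squares form: (1/4)(2a-b-c)^2 + (3/4)(b-c)^2
    sos = 0.25*(2*a - b - c)**2 + 0.75*(b - c)**2
    assert abs(D_over_4(a, b, c) - sos) < 1e-9
    assert D_over_4(a, b, c) >= -1e-12   # hence D >= 0

assert D_over_4(2.5, 2.5, 2.5) == 0.0    # D = 0 exactly when a = b = c
print("discriminant identity verified")
```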

Part 2 :

Before beginning, we assume that $a,b,c$ are distinct and, WLOG, that $a<b<c$.

Dividing by $(x-a)(x-b)(x-c)$ (legitimate, since none of $a,b,c$ is a root of the original equation when they are distinct), we obtain the equivalent equation:

$$\dfrac{1}{x-a}+\dfrac{1}{x-b}+\dfrac{1}{x-c}=0 \ \ \ (1)$$

In fact, write $f(x)$ for the LHS of (1); it is immediate that $f'(x)<0$ wherever $f$ is defined, so $f$ is strictly decreasing on each interval between its poles (see the graphical representation below). Consider the interval $(a,b)$: $\lim_{x \to a^+} f(x) = +\infty$ and $\lim_{x \to b^-} f(x) = -\infty$. Thus, by the intermediate value theorem, there is a root $x_1$ in this interval; moreover, this root is unique because $f$ is strictly decreasing there. The same argument applies to the interval $(b,c)$.

Thus we have two real roots such that:

$$a<x_1<b<x_2<c$$

No more than two real roots may occur, as we know from the quadratic form of the equation.
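The interlacing conclusion $a<x_1<b<x_2<c$ can be confirmed numerically by solving the quadratic form directly (a sketch with random distinct values, not part of the original answer):

```python
import math
import random

def roots(a, b, c):
    # Roots of 3x^2 - 2(a+b+c)x + (ab+bc+ca) = 0 via the quadratic formula
    s, p = a + b + c, a*b + b*c + c*a
    d = math.sqrt(4*s*s - 12*p)   # sqrt of D = 4(a^2+b^2+c^2-ab-bc-ca) > 0 here
    return (2*s - d) / 6, (2*s + d) / 6

random.seed(2)
for _ in range(1000):
    a, b, c = sorted(random.sample(range(-50, 50), 3))  # distinct, a < b < c
    x1, x2 = roots(a, b, c)
    assert a < x1 < b < x2 < c    # interlacing: a < x1 < b < x2 < c
print("interlacing verified")
```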

[Figure: graph of $f(x)=\frac{1}{x-a}+\frac{1}{x-b}+\frac{1}{x-c}$, strictly decreasing on each interval between the poles $a$, $b$, $c$]


Let $$f(x)=(x-a)(x-b)+(x-b)(x-c)+(x-c)(x-a)$$ Then $$f(a)=(a-b)(a-c)$$ $$f(b)=(b-c)(b-a)$$ $$f(c)=(c-a)(c-b)$$

Clearly, the coefficient of $x^2$ in $f(x)$ is positive. Suppose there are no real roots. Then $f(x)\gt0$ for all $x\in\mathbb R$; in particular, $$(a-b)(a-c)\gt0$$ Without loss of generality, let $a\gt b$ and $a\gt c$. Then $$(b-c)(b-a)\gt 0\Rightarrow c\gt b$$ $$(c-a)(c-b)\gt 0\Rightarrow b\gt c$$ This is a contradiction. Thus the graph must meet the $x$-axis, and the roots are real.
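The key observation here, that $f(a)$, $f(b)$, $f(c)$ cannot all be positive, can be checked numerically (a sketch with random sample values, not part of the original answer):

```python
import random

def f(x, a, b, c):
    # f as defined in this answer
    return (x-a)*(x-b) + (x-b)*(x-c) + (x-c)*(x-a)

random.seed(3)
for _ in range(1000):
    a, b, c = (random.uniform(-10, 10) for _ in range(3))
    vals = [f(t, a, b, c) for t in (a, b, c)]
    # At least one of f(a), f(b), f(c) is <= 0, so the upward-opening
    # parabola must meet the x-axis: the roots are real.
    assert min(vals) <= 0
print("sign argument verified")
```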