Let $f(x)=x^3 + ax^2 + bx + c$ with $a,b,c \in \mathbb R$, and suppose that $f(r)=0$ implies $f'(r) \ne 0$; that is, $f$ has no repeated root in $\mathbb C$, so $f$ has three distinct roots, at least one of which is real. Let $g(x) = 2 f''(x) f(x) - (f'(x))^2$.
How can one show that $g$ has exactly two real roots? And if $r < s$ are those two real roots, how can one show that $f(r) < 0$ and $f(s) > 0$?
Since $g'(x)=12 f(x)$, $g$ has degree $4$. Moreover, since $f$ has no double root, all four roots of $g$ are distinct: a repeated root of $g$ would also be a root of $g'=12f$, but at any root $r$ of $f$ we have $g(r)=-f'(r)^2<0$, so $g$ does not vanish there. I am unable to say anything else. Please help.
As $g'(x)=12f(x)$, all local extrema of $g$ occur at roots $r$ of $f$. At each such extremum, $g(r)=2f''(r)f(r)-f'(r)^2=-f'(r)^2<0$; so even if $f$ has three real roots, the local maximum of $g$ between them still has a negative value. Since the leading term of $g$ is $3x^4$, $g(x)\to+\infty$ as $|x|\to\infty$, so $g$ has roots both to the left and to the right of the root set of $f$. On those unbounded intervals $g'=12f$ has constant sign, so $g$ is monotone there; hence there is exactly one root of $g$ to the left of the leftmost real root of $f$ and exactly one to the right of the rightmost.
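The two facts used here, $g'=12f$ and the leading term $3x^4$, can be sanity-checked numerically; the random coefficients below are an arbitrary illustration, not part of the argument:

```python
import numpy as np

# Arbitrary real coefficients a, b, c for illustration only.
rng = np.random.default_rng(0)
a, b, c = rng.normal(size=3)

f   = np.poly1d([1.0, a, b, c])      # f(x) = x^3 + a x^2 + b x + c
fp  = f.deriv()                      # f'
fpp = fp.deriv()                     # f''
g   = 2 * fpp * f - fp * fp          # g = 2 f'' f - (f')^2

# g' and 12 f agree coefficient by coefficient
assert np.allclose(g.deriv().coeffs, (12 * f).coeffs)
# g has degree 4 with leading coefficient 3
assert g.order == 4 and np.isclose(g.coeffs[0], 3.0)
```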
In other words, let $\alpha=\min\{x\in\Bbb R:f(x)=0\}$ and $\beta=\max\{x\in\Bbb R:f(x)=0\}$. Then $g$ is negative on $[\alpha,\beta]$, monotone decreasing on $(-\infty,\alpha]$ and monotone increasing on $[\beta,\infty)$, with a sign change and thus exactly one root in each of these two intervals. This also answers the second question: the left root $r$ of $g$ lies in $(-\infty,\alpha)$, where $f<0$, and the right root $s$ lies in $(\beta,\infty)$, where $f>0$, so $f(r)<0<f(s)$.
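This sign picture can be checked on a concrete example; $f(x)=x^3-3x+1$ below is an arbitrary choice with three distinct real roots:

```python
import numpy as np

# Sample cubic with three distinct real roots (illustrative choice):
# f(x) = x^3 - 3x + 1
f  = np.poly1d([1.0, 0.0, -3.0, 1.0])
fp = f.deriv()
g  = 2 * fp.deriv() * f - fp * fp

# alpha, beta = leftmost and rightmost real roots of f
alpha, beta = min(f.roots.real), max(f.roots.real)

# g stays negative on the whole interval [alpha, beta]
xs = np.linspace(alpha, beta, 1000)
assert np.all(g(xs) < 0)
```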
Or another way, using the degree of $g$ more directly: if $g$ had four real roots $s_1\le s_2\le s_3\le s_4$, then $g(x)=3(x-s_1)(x-s_2)(x-s_3)(x-s_4)$ would take non-negative values on the interval $[s_2,s_3]$, hence also at the local maximum $r$ of $g$ in that interval. This is impossible by the first observation: $g(r)=-f'(r)^2<0$, since $r$ is a root of $f$, and a simple one at that, so $f'(r)\ne0$.
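Finally, both claims of the question can be verified numerically for the same illustrative cubic $f(x)=x^3-3x+1$:

```python
import numpy as np

# Illustrative cubic with three distinct real roots: f(x) = x^3 - 3x + 1
f  = np.poly1d([1.0, 0.0, -3.0, 1.0])
fp = f.deriv()
g  = 2 * fp.deriv() * f - fp * fp

# Keep only the real roots of g (discard the complex-conjugate pair)
real_roots = sorted(z.real for z in g.roots if abs(z.imag) < 1e-9)

assert len(real_roots) == 2        # g has exactly two real roots
r, s = real_roots
assert f(r) < 0 < f(s)             # f negative at the left root, positive at the right
```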