Problem Statement:-
If the cubic equation $f(x)=0$ has three real roots $\alpha$, $\beta$ and $\gamma$ such that $\alpha\lt\beta\lt\gamma$, show that the equation $$f(x)+2f'(x)+f''(x)=0$$ has a real root between $\alpha$ and $\gamma$.
Attempt at a solution:-
Since $f(x)=0$ is a cubic equation whose roots are $\alpha$, $\beta$ and $\gamma$, we may take (assuming $f$ monic) $$f(x)=(x-\alpha)(x-\beta)(x-\gamma)$$ From this we get $f'(x)$ and $f''(x)$ as follows:- $$f'(x)=(x-\alpha)(x-\beta)+(x-\beta)(x-\gamma)+(x-\gamma)(x-\alpha)\\ =3x^2-2(\alpha+\beta+\gamma)x+(\alpha\beta+\beta\gamma+\gamma\alpha)$$ $$f''(x)=2(3x-(\alpha+\beta+\gamma))$$
$$\therefore f(x)+2f'(x)+f''(x)=x^3-(\alpha+\beta+\gamma-6)x^2+(\alpha\beta+\beta\gamma+\gamma\alpha-4(\alpha+\beta+\gamma)+6)x+(-\alpha\beta\gamma+2(\alpha\beta+\beta\gamma+\gamma\alpha)-2(\alpha+\beta+\gamma))$$
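As a quick sanity check of this expansion (not part of the original argument), one can compare it symbolically against $f+2f'+f''$ with SymPy:

```python
import sympy as sp

x, a, b, c = sp.symbols('x alpha beta gamma')
f = (x - a) * (x - b) * (x - c)

# f + 2f' + f'' computed directly from the product form
expr = sp.expand(f + 2 * sp.diff(f, x) + sp.diff(f, x, 2))

# The expanded form claimed above (note the -alpha*beta*gamma constant term)
claimed = sp.expand(
    x**3
    - (a + b + c - 6) * x**2
    + (a*b + b*c + c*a - 4*(a + b + c) + 6) * x
    + (-a*b*c + 2*(a*b + b*c + c*a) - 2*(a + b + c))
)

print(sp.simplify(expr - claimed))  # 0, so the two expressions agree
```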
Now, after seeing this humongous expression for $f(x)+2f'(x)+f''(x)$ and not finding anything useful that could shorten the solution, I thought it would be fun to substitute $x=\alpha$ and $x=\gamma$ and try to show that the two values have opposite signs. It turned out to be the opposite of fun; I spent a lot of time on it and got nowhere.
Any help would be much appreciated. Perhaps you can just give me a hint on how to approach the question in a less disgusting manner than mine.
Here's a nice trick: Let $g(x)=f(x)e^x$. Then $g'(x)=(f'(x)+f(x))e^x$ and $g''(x)=(f''(x)+2f'(x)+f(x))e^x$. By what is given, $g$ has three roots $\alpha<\beta<\gamma$. By Rolle, $g'$ has a root strictly between $\alpha$ and $\beta$ as well as one strictly between $\beta$ and $\gamma$. Hence by Rolle again, $g''$ has a root between these two roots of $g'$. As $e^x\ne 0$, this root of $g''$ is also a root of $f+2f'+f''$.
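To see the trick in action numerically (an illustrative example of my own, with the cubic $f(x)=x(x-1)(x-2)$, so $\alpha=0$, $\beta=1$, $\gamma=2$): expanding by hand gives $f+2f'+f''=x^3+3x^2-4x-2$, which changes sign on $(\alpha,\gamma)$, so bisection locates the promised root there.

```python
def f(x):
    return x * (x - 1) * (x - 2)      # cubic with roots 0 < 1 < 2

def h(x):
    # f + 2f' + f'' for this cubic, expanded by hand:
    # (x^3 - 3x^2 + 2x) + 2(3x^2 - 6x + 2) + (6x - 6)
    return x**3 + 3*x**2 - 4*x - 2

# h(0) = -2 < 0 and h(2) = 10 > 0, so bisection on [0, 2] converges
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = (lo + hi) / 2
    if h(lo) * h(mid) <= 0:
        hi = mid
    else:
        lo = mid
root = (lo + hi) / 2

print(root)   # a root of f + 2f' + f'' strictly between alpha = 0 and gamma = 2
```

Of course this only checks one example; the $g=fe^x$ argument above is what proves it in general.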
Remark: We did not use the fact that $f$ is cubic. It suffices that $f$ is twice differentiable.