Problem Statement:-
If $f(x)=0$ is a cubic equation with real roots $\alpha,\beta,\gamma$ in increasing order of magnitude, show that one root of the equation $f^\prime(x)=0$ lies between $\dfrac{\alpha+\beta}{2}$ and $\dfrac{2\alpha+\beta}{3}$, and the other root lies between $\dfrac{\beta+\gamma}{2}$ and $\dfrac{\beta+2\gamma}{3}$.
My Attempt at a solution:-
Attempt-1:-
I thought the best method would be to substitute the given values between which the roots of $f'(x)=0$ are supposed to lie, and then show that $f'$ takes opposite signs at those points, so a root must lie between them. But after trying this I realized pretty quickly that it leads to a dragged-out computation and does more harm than good.
Attempt-2:-
Inspired by this problem of mine, I thought: why not consider a function $g(x)$ such that $$g(x)=e^xf(x)$$
Then, $$g'(x)=e^x(f(x)+f'(x))$$ has roots in $(\alpha,\beta)$ and $(\beta,\gamma)$.
After this I was not able to make further progress toward the sharper bounds placing the roots in $\left(\dfrac{2\alpha+\beta}{3},\dfrac{\alpha+\beta}{2}\right)$ and $\left(\dfrac{\beta+\gamma}{2},\dfrac{\beta+2\gamma}{3}\right)$.
Your help would be very appreciated.
**Edit:-**Made changes to the intervals in which the roots are supposed to lie as pointed out by Adriano and dxiv
Extended hint: with the linear substitution $y = (x-\alpha) / (\beta - \alpha)$ the equation in $y$ will have the roots $0 \lt 1 \lt \lambda = (\gamma-\alpha) / (\beta - \alpha)$ so the polynomial in $y$ can be written as: $$g(y) \;=\; y(y-1)(y-\lambda) \;=\; y^3 - (1+\lambda)y^2 + \lambda y$$
Then:$$g'(y) \;=\; 3 y^2 - 2(1+\lambda)y + \lambda$$
is easily verified to have $2$ positive real roots for $\lambda \gt 1$: its discriminant $4(1+\lambda)^2 - 12\lambda = 4(\lambda^2 - \lambda + 1)$ is positive, and both the sum $\frac{2(1+\lambda)}{3}$ and the product $\frac{\lambda}{3}$ of the roots are positive. Let $\mu$ be the smaller one: $$\mu = \frac{1+\lambda-\sqrt{1-\lambda+\lambda^2}}{3}$$
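As a quick numerical sanity check (not part of the proof; the value $\lambda = 2.5$ is an arbitrary sample), the closed form for $\mu$ should agree with the smaller root of $g'(y) = 3y^2 - 2(1+\lambda)y + \lambda$ computed directly by the quadratic formula:

```python
import math

def mu(lam):
    """Closed form for the smaller root of g'(y) = 3y^2 - 2(1+lam)y + lam."""
    return (1 + lam - math.sqrt(1 - lam + lam**2)) / 3

# Quadratic formula applied directly, for a sample lambda > 1.
lam = 2.5
a, b, c = 3.0, -2 * (1 + lam), lam
root_small = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)

print(root_small, mu(lam))  # the two values agree
```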
It is again easily verified that $\mu$ is a strictly increasing function of $\lambda \in (1,\infty)$. Moreover:
$$\mu_{min} = \lim_{\lambda \to 1} \;\frac{1+\lambda-\sqrt{1-\lambda+\lambda^2}}{3} = \frac{1}{3}$$
$$\mu_{max} = \lim_{\lambda \to \infty} \;\frac{1+\lambda-\sqrt{1-\lambda+\lambda^2}}{3} = \frac{1}{2}$$
It follows that $\frac{1}{3} \lt \mu \lt \frac{1}{2}$ so the lowest root of $g'(y)$ is in $\left(\frac{1}{3},\frac{1}{2}\right)$. Going back to the $x$ domain using the inverse of the original substitution which is $x=(\beta-\alpha)y+\alpha$ gives the corresponding root of $f'(x)$ as being in $\left(\frac{2 \alpha + \beta}{3},\frac{\alpha+\beta}{2}\right)$.
Applying the result above to $f(-x)$ with roots $-\gamma \lt -\beta \lt -\alpha$ proves the second part for (the corrected) interval $\left(\frac{\beta+\gamma}{2},\frac{\beta+2\gamma}{3}\right)$.
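As an end-to-end sanity check of both intervals (the roots $\alpha=-1$, $\beta=2$, $\gamma=7$ are an arbitrary choice for illustration), one can form $f$ from its roots and confirm that both critical points of $f$ land where the proof says they must:

```python
import math

# Arbitrary sample roots alpha < beta < gamma.
al, be, ga = -1.0, 2.0, 7.0

# f(x) = (x-al)(x-be)(x-ga), so
# f'(x) = 3x^2 - 2(al+be+ga)x + (al*be + be*ga + ga*al).
a = 3.0
b = -2 * (al + be + ga)
c = al * be + be * ga + ga * al
disc = math.sqrt(b * b - 4 * a * c)
r1, r2 = (-b - disc) / (2 * a), (-b + disc) / (2 * a)  # r1 < r2

assert (2 * al + be) / 3 < r1 < (al + be) / 2   # first critical point
assert (be + ga) / 2 < r2 < (be + 2 * ga) / 3   # second critical point
```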