Proving an inequality using Taylor's Theorem


I need to show that $x^{1/3} < \frac{1}{3}x + \frac{2}{3}$ for all $x \in (0,1)$. I have been given the hint to consider the expression $\frac{1}{3}x - x^{1/3}$, but its Taylor series centred at $x=0$ vanishes after only two terms.

Do I have to centre the expansion about a different real number?


There are 3 answers below.


Of course. $x^{\frac 1 3}$ has a vertical tangent at $x=0$; that's precisely why you are getting infinite derivatives.


Why do you need a Taylor series at all? Since $x \in (0,1)$, substitute $y = x^{1/3} \in (0,1)$; the inequality becomes $y < \frac{y^3}{3} + \frac{2}{3}$, which is not hard to show directly.
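(One way to finish the "not hard" step, for anyone who wants it spelled out: the difference factors nicely,
$$\frac{y^3}{3} + \frac{2}{3} - y = \frac{y^3 - 3y + 2}{3} = \frac{(y-1)^2(y+2)}{3},$$
and for $y \in (0,1)$ both $(y-1)^2 > 0$ and $y + 2 > 0$, so the difference is strictly positive.)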


The easiest proof I can see doesn't really use the hint: since both sides are non-negative, we can cube them without reversing the inequality, so it suffices to show that $x < \left(\frac{x}{3} + \frac{2}{3}\right)^3$ everywhere on $(0,1)$. Now consider $f(x) = \left(\frac{x}{3} + \frac{2}{3}\right)^3 - x = \frac{x^3}{27} + \frac{2x^2}{9} - \frac{5x}{9} + \frac{8}{27}$. The derivative of $f$ is $f'(x) = \frac{(x+5)(x-1)}{9}$, which is negative everywhere on $(0,1)$.

So $f$ is monotonically decreasing on $(0,1)$. Thus for all $x \in (0,1)$, $f(x) > f(1)$.

But $f(1) = 0$, so $f(x) > 0$ for all $x \in (0,1)$; thus $\left(\frac{x}{3} + \frac{2}{3}\right)^3 > x$, and hence $x^{1/3} < \frac{x}{3} + \frac{2}{3}$.
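As a quick numerical sanity check of this argument (not a substitute for the proof), here is a short Python sketch that samples $(0,1)$ and verifies that $f$ stays positive, that $f(1) = 0$, and that the original inequality holds at the sample points:

```python
# Sanity-check the cubing argument: f(x) = (x/3 + 2/3)^3 - x
# should be positive on (0,1) and vanish at x = 1.

def f(x):
    return (x / 3 + 2 / 3) ** 3 - x

# Sample the open interval (0,1) at 999 points.
samples = [i / 1000 for i in range(1, 1000)]

assert all(f(x) > 0 for x in samples)                       # f > 0 on (0,1)
assert abs(f(1.0)) < 1e-12                                  # f(1) = 0
assert all(x ** (1 / 3) < x / 3 + 2 / 3 for x in samples)   # original inequality
print("all checks passed")
```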