A simple inequality for exponential functions


How can I prove the following inequality:

$e^{x}\leq1+x+x^2$

for $|x|\leq1/2$.

It looks simple, but I don't know where to start.


Best answer:

Let $g(x)=1+x+x^2-e^x$, so $g(0)=0$.

If you can use Taylor polynomials,

then $e^x=1+x+\frac{x^2}{2!}+\frac{e^c}{3!}x^3$ where $c$ is between 0 and $x$,

so $g(x)=\frac{x^2}{2!}-\frac{e^c}{3!}x^3$.

Then $g(x)>0$ for $-\frac{1}{2}\le x<0$, since there $x^3<0$, so both terms $\frac{x^2}{2!}$ and $-\frac{e^c}{3!}x^3$ are positive,

and $g(x)>0$ for $0<x\le\frac{1}{2}$ since $\frac{e^c}{3!}x^3<\frac{2}{6}x^3\le\frac{2}{6}\left(\frac{1}{2}\right)x^2<\frac{x^2}{2!}$

(since $e^c<e^{1/2}<2$ because $e<4$)
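As a quick numerical sanity check (not a substitute for the proof), one can sample the inequality $e^x\le 1+x+x^2$ on a fine grid over $|x|\le\frac{1}{2}$:

```python
import math

# Sanity check (not a proof): verify e^x <= 1 + x + x^2
# at 10001 evenly spaced points in [-1/2, 1/2].
n = 10_000
for i in range(n + 1):
    x = -0.5 + i / n  # sweeps [-0.5, 0.5]
    assert math.exp(x) <= 1 + x + x**2, f"fails at x={x}"
print("inequality holds at all sampled points")
```

Equality occurs only at $x=0$, where both sides equal $1$.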


If you can't use Taylor polynomials, you can still define $g(x)=1+x+x^2-e^x$ as above

and use that $g(0)=0, \; g^{\prime}(0)=0,\;$ and $g^{\prime\prime}(x)=2-e^x$:

$\textbf{1)}$ If $g(x)<0$ where $0<x\le\frac{1}{2}$,

then $g^{\prime}(c)=\frac{g(x)-g(0)}{x-0}=\frac{g(x)}{x}<0$ for some $c$ in $(0,x)$ by the Mean Value Theorem,

so $g^{\prime\prime}(d)=\frac{g^{\prime}(c)-g^{\prime}(0)}{c-0}=\frac{g^{\prime}(c)}{c}<0$ for some $d$ in $(0,c)$ by the Mean Value Theorem.

Then $2-e^d<0$, i.e. $e^d>2$, and this gives a contradiction since $0<d<\frac{1}{2}$ implies $e^d<e^{1/2}<2$.

$\textbf{2)}$ If $g(x)<0$ where $-\frac{1}{2}\le x<0$,

then $g^{\prime}(c)=\frac{g(x)-g(0)}{x-0}=\frac{g(x)}{x}>0$ for some $c$ in $(x,0)$ by the Mean Value Theorem,

so $g^{\prime\prime}(d)=\frac{g^{\prime}(c)-g^{\prime}(0)}{c-0}=\frac{g^{\prime}(c)}{c}<0$ for some $d$ in $(c,0)$ by the Mean Value Theorem.

Then $2-e^d<0$, i.e. $e^d>2$, and this gives a contradiction since $d<0$ implies $e^d<e^0=1<2$.

Therefore $g(x)\ge0$ for $\big|x\big|\le\frac{1}{2}$.
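Both contradiction arguments rest on the same fact: $g^{\prime\prime}(x)=2-e^x$ cannot be negative on $\left[-\frac{1}{2},\frac{1}{2}\right]$, because $e^{1/2}<2$. A short numerical sketch of that key fact:

```python
import math

# Key fact behind the contradictions: e^(1/2) < 2 (indeed e < 4).
assert math.exp(0.5) < 2

# Hence g''(x) = 2 - e^x > 0 across [-1/2, 1/2]; check on a grid.
# The minimum of 2 - e^x on the interval occurs at x = 1/2.
for i in range(1001):
    x = -0.5 + i / 1000
    assert 2 - math.exp(x) > 0
print("g''(x) = 2 - e^x > 0 on [-1/2, 1/2]")
```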