Prove that $e^{-ax} \leq (1 - x)^a + ax^2/2$ for $a > 1$, $0 \le x \le 1$


Here we assume that all variables are real. I would like to show that $$ e^{-ax} \leq (1 - x)^{a} + \frac{1}{2} ax^2 \tag{1} $$ for all $a > 1$, $0 \leq x \leq 1$.

By Taylor’s theorem, in order to show (1), it suffices to show that $$ (a - 1)(1 - \theta)^{a - 2} - a e^{-a \theta} + 1 \geq 0 \tag{2} $$ for all $a > 1$ and $0 \leq \theta \leq 1$. Indeed, writing $g(x) = (1 - x)^a + \frac{1}{2} ax^2 - e^{-ax}$, we have $g(0) = g'(0) = 0$ and $g''(\theta) = a\left[(a - 1)(1 - \theta)^{a - 2} - a e^{-a \theta} + 1\right]$, so (1) follows once (2) holds. But (2) does not seem easy to show directly: if we keep taking higher derivatives of its left-hand side with respect to $\theta$, we have to treat different ranges of $a$ separately. My approach seems complicated, so I would appreciate a proof of either (1) or (2).
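As a quick numeric sanity check (not a proof), both (1) and (2) can be tested on a grid of $(a, x)$ values; the grid, tolerance, and sample values of $a$ below are arbitrary choices:

```python
import math

# Numeric sanity check (not a proof) of inequalities (1) and (2)
# on a grid of (a, x) values, with a small tolerance for rounding.
def check_1(a, x):
    # (1): e^{-ax} <= (1 - x)^a + a x^2 / 2
    return math.exp(-a * x) <= (1 - x) ** a + 0.5 * a * x ** 2 + 1e-12

def check_2(a, t):
    # (2): (a - 1)(1 - t)^{a-2} - a e^{-at} + 1 >= 0
    return (a - 1) * (1 - t) ** (a - 2) - a * math.exp(-a * t) + 1 >= -1e-12

for a in [1.01, 1.5, 2.0, 5.0, 10.0]:
    for i in range(101):
        x = i / 100
        assert check_1(a, x), (a, x)
        # skip t = 1: for a < 2 the exponent a - 2 is negative and
        # 0.0 ** (negative) raises ZeroDivisionError
        if x < 1:
            assert check_2(a, x), (a, x)
```

Both checks pass on this grid, which is at least consistent with (1) and (2).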

Best answer

Since both sides of (1) are positive, taking logarithms shows that it suffices to prove $$f(x) := \ln \left((1 - x)^{a} + \frac{1}{2} ax^2\right) + ax \ge 0 \quad \text{for all } x \in [0, 1].$$

We have $$f'(x) = \frac{ax\left(ax + 2 - 2(1-x)^{a-1}\right)}{ax^2 + 2(1-x)^a} \ge 0, \tag{3}$$ since $a > 1$ and $x \in [0, 1]$ give $(1-x)^{a-1} \le 1$, hence $ax + 2 - 2(1-x)^{a-1} \ge 0$, and the denominator $ax^2 + 2(1-x)^a$ is positive on $[0, 1]$.
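The closed form for $f'(x)$ above can be cross-checked numerically (again, not a proof) against a central finite difference of $f$; the step size and sample points below are arbitrary:

```python
import math

# Numeric cross-check (not a proof) of the closed form for f'(x)
# against a central finite difference of
# f(x) = ln((1 - x)^a + a x^2 / 2) + a x.
def f(a, x):
    return math.log((1 - x) ** a + 0.5 * a * x ** 2) + a * x

def fprime_closed(a, x):
    return a * x * (a * x + 2 - 2 * (1 - x) ** (a - 1)) / (a * x ** 2 + 2 * (1 - x) ** a)

h = 1e-6  # finite-difference step
for a in [1.5, 3.0, 7.0]:
    for x in [0.1, 0.4, 0.8]:
        numeric = (f(a, x + h) - f(a, x - h)) / (2 * h)
        assert abs(numeric - fprime_closed(a, x)) < 1e-5, (a, x)
        assert fprime_closed(a, x) >= 0  # the sign claim f'(x) >= 0
```

For instance, at $a = 2$, $x = 1/2$ the closed form evaluates to exactly $2$, matching the direct derivative of $\ln((1-x)^2 + x^2) + 2x$ there.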

Since $f' \ge 0$, $f$ is nondecreasing, and $f(0) = 0$; hence $f(x) \ge 0$ on $[0, 1]$, which is (1) after exponentiating.

We are done.