Proof for the inequality $e^{\lambda x} \leq (\frac{1-x}{2}) e^{-\lambda} + (\frac{1+x}{2})e^{\lambda}$


As the title says, I am trying to show that the inequality

$e^{\lambda x} \leq \frac{1-x}{2}\, e^{-\lambda} + \frac{1+x}{2}\, e^{\lambda}$ holds for all $\lambda \in \mathbb{R}$ and $x \in [-1,1]$.

Any help would be appreciated.


3 Answers

BEST ANSWER

$\lambda x = \frac{1+x}{2}\,\lambda + \frac{1-x}{2}\,(-\lambda)$, and since $x \in [-1,1]$ the coefficients $\frac{1+x}{2}$ and $\frac{1-x}{2}$ are nonnegative and sum to $1$, so this is a convex combination. Now use the fact that $e^{t}$ is a convex function.
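Spelling out the convexity step (a short sketch; convexity of $t \mapsto e^{t}$ means $e^{\alpha a + (1-\alpha)b} \leq \alpha e^{a} + (1-\alpha) e^{b}$ for $\alpha \in [0,1]$):

```latex
% \lambda x written as a convex combination of \lambda and -\lambda,
% with weight \alpha = (1+x)/2 \in [0,1] since x \in [-1,1]:
\begin{align*}
e^{\lambda x}
  &= \exp\!\Big(\tfrac{1+x}{2}\,\lambda + \tfrac{1-x}{2}\,(-\lambda)\Big) \\
  &\leq \tfrac{1+x}{2}\, e^{\lambda} + \tfrac{1-x}{2}\, e^{-\lambda},
\end{align*}
% which is exactly the desired inequality.
```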

ANSWER

Hint. Note that $t \mapsto e^{t}$ is convex on $\mathbb{R}$, and therefore for any $t,s\in\mathbb{R}$ and $\alpha\in [0,1]$, $$e^{\alpha t+(1-\alpha)s}\leq \alpha e^{t}+(1-\alpha) e^{s}.$$ Now what are $t$, $s$, $\alpha$ in your case?
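For completeness, one natural choice answering the hint's question:

```latex
% Take t = \lambda, s = -\lambda, and \alpha = \frac{1+x}{2},
% which satisfies \alpha \in [0,1] precisely because x \in [-1,1]. Then
\alpha t + (1-\alpha)s
  \;=\; \frac{1+x}{2}\,\lambda - \frac{1-x}{2}\,\lambda
  \;=\; \lambda x,
% so the convexity inequality becomes the desired bound.
```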

ANSWER

Instead of convexity in $\lambda$, we can look at concavity in $x$:

Note that $f(x)=\frac{1-x}{2}e^{-\lambda}+\frac{1+x}{2}e^{\lambda}-e^{\lambda x}$ is concave in $x$ and $f(-1)=f(1)=0$. A concave function that vanishes at both endpoints of an interval is nonnegative on that interval, so $f(x)\geq 0$ for all $x\in[-1,1]$, which is the claimed inequality.
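The concavity claim can be verified directly by differentiating the $f$ defined above:

```latex
% f(x) = \frac{1-x}{2} e^{-\lambda} + \frac{1+x}{2} e^{\lambda} - e^{\lambda x}
\begin{align*}
f'(x)  &= \tfrac{1}{2}\big(e^{\lambda} - e^{-\lambda}\big) - \lambda\, e^{\lambda x},\\
f''(x) &= -\lambda^{2}\, e^{\lambda x} \;\leq\; 0,
\end{align*}
% so f is concave; since a concave function lies above the chord joining
% (-1, f(-1)) and (1, f(1)), and that chord is the zero function,
% f(x) >= 0 on [-1, 1].
```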