How do I prove that $$e^x \leq x + e^{x^2}$$ for all $x\in\mathbb R$?
My probability book (Grimmett and Stirzaker) says that it's a simple exercise, but I don't see it. For $x\leq 0$, we have $$e^x = \sum_{k=0}^\infty \frac{x^{2k}}{(2k)!} + x + \sum_{k=1}^\infty \frac{x^{2k+1}}{(2k+1)!} \leq \sum_{k=0}^\infty \frac{x^{2k}}{k!} + x = e^{x^2} + x,$$ since $(2k)! \geq k!$ and the odd-power tail is nonpositive when $x\leq 0$. How do I show it for $x>0$?
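As a quick sanity check (not part of the proof), one can verify the bound numerically on a grid of nonpositive values; this is just a sketch using Python's standard `math` module:

```python
import math

# Spot-check e^x <= e^{x^2} + x on a grid of nonpositive x in [-10, 0].
# A small tolerance absorbs floating-point rounding near the equality case x = 0.
for i in range(501):
    x = -i / 50.0
    assert math.exp(x) <= math.exp(x * x) + x + 1e-12

print("bound holds at all sampled x <= 0")
```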
How about this:
For $x\geq 0$, we have $$ x e^{-x}+e^{x^2-x}\geq x(1-x)+1+x^2-x=1, $$ where we used the inequality $e^{u} \geq 1+u$ for all $u$ (and $x\geq 0$, so multiplying $e^{-x}\geq 1-x$ by $x$ preserves the direction). Multiplying both sides by $e^x$, we find $$x+e^{x^2}\geq e^x.$$ For $x<0$, note that $e^{x^2}\geq 1+x^2$, so $$ x+e^{x^2}\geq 1+x+x^2=3/4+(x+1/2)^2>0. $$ Since $e^{-x}\geq 1-x>0$ as well, we may multiply these two positive quantities to get $$ \left(x+e^{x^2}\right)e^{-x}\geq(1+x+x^2)(1-x)=1-x^3, $$ which implies that $$ x+e^{x^2}\geq(1-x^3)e^x>e^x, $$ since $1-x^3>1$ when $x<0$.
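To build confidence in the argument, here is a small numerical sketch (using only the standard `math` module) that checks the target inequality together with the two elementary bounds $e^u \geq 1+u$ the proof relies on, sampled over $[-10, 10]$:

```python
import math

# Check the target inequality e^x <= x + e^{x^2} and the auxiliary
# bounds e^u >= 1 + u (at u = x and u = x^2) on a symmetric grid.
for i in range(-500, 501):
    x = i / 50.0  # x ranges over [-10, 10]
    assert math.exp(x) <= x + math.exp(x * x) + 1e-9  # target inequality
    assert math.exp(x) >= 1 + x                       # e^u >= 1 + u at u = x
    assert math.exp(x * x) >= 1 + x * x               # e^u >= 1 + u at u = x^2

print("all sampled points satisfy the inequalities")
```

Of course this samples only finitely many points; the proof above is what establishes the inequality for all real $x$.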