Does anybody have suggestions for approximating $f(x) = \max (0, 1 - \exp (x))$ with a function that is at least twice differentiable, strictly greater than or equal to $0$ across its domain, and not prone to introducing numerical issues into nonlinear optimization programs (NLPs)?
I would like to include $0 \leq y \leq f(x)$ as a constraint in an optimization problem without having to resort to using integer/boolean variables, hence I need some sort of continuous approximation. Here $y$ is some other variable. Importantly, $f$ must not go negative, otherwise the problem will become infeasible.
I tried multiplying $f$ by various sigmoidal functions, but they invariably do a poor job near the origin or, worse, go negative. E.g., see the figure, where $f(x) \approx \frac{1-\exp(x)}{1+\exp(100x)}$. For my application it is important that $f$ goes to $0$ very rapidly as $x$ approaches the origin from the negative side, but never itself goes negative. Does anybody have any ideas?
You can rewrite $\max(0,x)$ as $\frac{x+|x|}{2}$. Then approximate $|x|$ as $\sqrt{\epsilon+x^2}$, where you can make $\epsilon$ as small as needed, but positive.
Putting it all together (with, e.g., $\epsilon = 0.01$): $$f(x)=\frac{1-e^x+\sqrt{0.01+(1-e^x)^2}}{2}>\max(0,1-e^x)$$
Additionally, the resulting function is infinitely differentiable, and since it is a strict upper bound on $\max(0,1-e^x)$, the constraint $0 \leq y \leq f(x)$ can never become infeasible due to $f$ going negative.
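A quick numerical sanity check of the construction above (a sketch using NumPy; the function names `f_exact` and `f_smooth` are just illustrative):

```python
import numpy as np

def f_exact(x):
    """The original nonsmooth function: max(0, 1 - exp(x))."""
    return np.maximum(0.0, 1.0 - np.exp(x))

def f_smooth(x, eps=0.01):
    """Smooth strict upper bound: (u + sqrt(eps + u^2)) / 2 with u = 1 - exp(x)."""
    u = 1.0 - np.exp(x)
    return 0.5 * (u + np.sqrt(eps + u * u))

x = np.linspace(-5.0, 5.0, 1001)
# Strict upper bound everywhere, hence strictly positive everywhere.
assert np.all(f_smooth(x) > f_exact(x))
assert np.all(f_smooth(x) > 0.0)
# The gap is largest at the kink x = 0, where it equals sqrt(eps)/2.
print(f_smooth(0.0))  # -> 0.05 for eps = 0.01
```

Shrinking `eps` tightens the approximation (the error at the kink is $\sqrt{\epsilon}/2$), at the cost of larger second derivatives near $x=0$, which is the usual trade-off for NLP solvers.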