To minimize a function $f(x)$ subject to an equality constraint $g(x) = 0$, we can find stationary points of the Lagrangian $L(x, \lambda) = f(x) - \lambda g(x)$.
However, I want to minimize a function, $f(x)$, with an inequality constraint, $x \le 0$.
I think I could approximate this by minimizing $J(x; \alpha) = f(x) + e^{\alpha x}$ for some huge number $\alpha$, because $$\lim_{\alpha\to\infty} e^{\alpha x} = \begin{cases} 0 & x < 0 \\ \infty & x > 0 \end{cases}$$ (at $x = 0$ the penalty is exactly $1$ for every $\alpha$, but that single point shouldn't matter much).
This would make my objective $J$ enormous if I chose any positive $x$, while adding almost nothing for negative $x$.
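To see the idea concretely, here is a quick numerical sketch. The quadratic objective $f(x) = (x-1)^2$ (whose unconstrained minimum at $x = 1$ violates $x \le 0$), the bracketing interval, and the `minimize_1d` helper are all my own toy choices, not anything from the question:

```python
import math

def minimize_1d(J, lo, hi, iters=200):
    # Golden-section search on a unimodal function over [lo, hi].
    phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c = b - phi * (b - a)
        d = a + phi * (b - a)
        if J(c) < J(d):
            b = d
        else:
            a = c
    return (a + b) / 2

def f(x):
    # Toy objective: unconstrained minimum at x = 1, but the
    # constrained minimum subject to x <= 0 is at x = 0.
    return (x - 1) ** 2

for alpha in (10, 100, 1000):
    J = lambda x, a=alpha: f(x) + math.exp(a * x)
    x_star = minimize_1d(J, -2.0, 1.0)
    print(alpha, round(x_star, 4))
```

Running this, the minimizer of $J$ stays strictly negative and creeps toward $0$ from below as $\alpha$ grows, which matches the intuition: setting $J'(x) = 0$ for this $f$ gives $x^* \approx \ln(2/\alpha)/\alpha \to 0^-$.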
Would this work?