Suppose there is the simple function:
\begin{align} f(x,y,z) &= (x-a)^2 + (y-b)^2 + (z-c)^2 + (x+y-S-z - d)^2 \end{align}
where $a,b,c,d$ are nonnegative constants and $S$ is an integer. I want to find $x^* = (x, y, z, x+y-S-z)$ with the smallest distance from $(a,b,c,d)$, such that all entries of $x^*$ are greater than or equal to zero. This motivates the following optimization problem:
\begin{align} \min & \qquad f(x,y,z) \\ \text{subject to} & \qquad -x \leq 0 \\ & \qquad -y \leq 0 \\ & \qquad -z \leq 0 \\ & \qquad -(x+y-S-z) \leq 0 \end{align} Now I want to rewrite the problem in an equivalent form that eliminates the inequality constraints. As a first attempt, computing the Lagrangian gives: \begin{align} L(x,y,z,\lambda_1,\lambda_2, \lambda_3, \lambda_4) = f(x,y,z) - \lambda_1 x - \lambda_2 y - \lambda_3 z - \lambda_4(x+y-S-z) \end{align} However, I cannot see a way to rewrite this optimization problem in an equivalent form that gets rid of the Lagrange multipliers. Is this possible?
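For concreteness, setting the partial derivatives of $L$ to zero gives the stationarity part of the KKT conditions:
\begin{align}
\frac{\partial L}{\partial x} &= 2(x-a) + 2(x+y-S-z-d) - \lambda_1 - \lambda_4 = 0 \\
\frac{\partial L}{\partial y} &= 2(y-b) + 2(x+y-S-z-d) - \lambda_2 - \lambda_4 = 0 \\
\frac{\partial L}{\partial z} &= 2(z-c) - 2(x+y-S-z-d) - \lambda_3 + \lambda_4 = 0
\end{align}
together with $\lambda_i \geq 0$ and the complementary slackness conditions $\lambda_1 x = \lambda_2 y = \lambda_3 z = \lambda_4(x+y-S-z) = 0$, which couple the multipliers to the constraints.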
In general, it is not possible to get rid of the inequalities or the Lagrange multipliers. In other words, there is no single system of equations you can solve.
If you really want to proceed this way, you can always compute a zero of the gradient, and if that point does not satisfy the inequalities, look for the minimum on the boundary of the feasible region.
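A minimal numerical sketch of that recipe, assuming some illustrative values for $a,b,c,d,S$ (the constants are placeholders, and `scipy.optimize.minimize` stands in for the "minimum on the boundary" step rather than an explicit case analysis):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative constants: a, b, c, d >= 0 and S an integer.
a, b, c, d, S = 1.0, 2.0, 0.5, 3.0, 4

def f(v):
    x, y, z = v
    return (x - a)**2 + (y - b)**2 + (z - c)**2 + (x + y - S - z - d)**2

# Step 1: zero of the gradient (unconstrained minimizer).
# Differentiating f and setting the gradient to zero gives the linear system:
#   2x +  y - z = a + S + d
#    x + 2y - z = b + S + d
#   -x -  y + 2z = c - S - d
A = np.array([[ 2.0,  1.0, -1.0],
              [ 1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
rhs = np.array([a + S + d, b + S + d, c - S - d])
x, y, z = np.linalg.solve(A, rhs)

# Step 2: if the unconstrained minimizer violates an inequality,
# minimize f subject to the four constraints instead.
if min(x, y, z, x + y - S - z) < 0:
    cons = [{'type': 'ineq', 'fun': lambda v: v[0]},
            {'type': 'ineq', 'fun': lambda v: v[1]},
            {'type': 'ineq', 'fun': lambda v: v[2]},
            {'type': 'ineq', 'fun': lambda v: v[0] + v[1] - S - v[2]}]
    res = minimize(f, x0=np.zeros(3), constraints=cons)
    x, y, z = res.x

print(x, y, z, x + y - S - z)
```

With these particular constants the zero of the gradient has $z < 0$, so the constrained fallback runs and the minimum is attained on the boundary, exactly as described above.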