Linear optimization: is it possible to implement "$\min$" function linearly?


I am working on an optimization problem whose objective function includes a $\min$ function. Let $g$ be a discrete bounded function. The value of $g(i)$ must be ignored if $g(i)>0$ and must be counted if $g(i)<0$. In other words, writing the objective function as $f$, the problem is

$$\max_{i} f(i) = \max_{i}\left\{ f_1(i) + \min\bigl(0,g(i)\bigr)\right\}$$

How can the $\min$ function be expressed linearly in the objective? Is it possible to use an auxiliary variable $z_i$, as in

$$\max_{i,z_i} f(i) = \max_{i,z_i}\left\{f_1(i) - z_i\right\}$$ subject to:

\begin{align} z_i &\ge 0, \\[0.2cm] z_i &\ge -g(i) \end{align}

Thanks.


BEST ANSWER

Your reformulation is correct, although the subscript on $z$ is unnecessary. Note that the reformulation is only valid because your objective is a maximization. It is less confusing to let $z=\min(0, g(i))$ (you have the negative of this expression):

$$\max_{i, z} \left\{ f_1(i)+z \;:\; z \leq 0,\ z \leq g(i)\right\}$$
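As a sanity check, here is a small sketch (not part of the original answer) that verifies the trick numerically: for each $i$, maximizing $f_1(i)+z$ subject to $z \le 0$ and $z \le g(i)$ drives $z$ up to $\min(0, g(i))$, so the optimal value equals $f_1(i)+\min(0,g(i))$. The data values for `f1` and `g` are hypothetical, chosen to lie exactly on the search grid; the one-variable "LP" is solved by brute-force enumeration rather than a real solver.

```python
# Hypothetical sample data: f_1(i) and g(i) on a grid of multiples of 0.5.
f1 = {0: 3.0, 1: 5.0, 2: 4.0}
g  = {0: -2.0, 1: -4.5, 2: 1.0}

def lp_value(f1_i, g_i, z_lo=-10.0, step=0.5):
    """Solve max_z  f1_i + z  s.t.  z <= 0,  z <= g_i  by grid enumeration.

    The grid step is chosen so that min(0, g_i) falls exactly on the grid,
    hence the enumeration finds the true optimum z* = min(0, g_i).
    """
    best = float("-inf")
    n_steps = int((0.0 - z_lo) / step)
    for k in range(n_steps + 1):
        z = z_lo + k * step
        if z <= 0.0 and z <= g_i:       # both epigraph constraints
            best = max(best, f1_i + z)  # maximization pushes z upward
    return best

for i in f1:
    direct = f1[i] + min(0.0, g[i])     # original nonlinear objective
    assert abs(lp_value(f1[i], g[i]) - direct) < 1e-9
    print(f"i={i}: LP value = {lp_value(f1[i], g[i])} = f1 + min(0, g)")
```

Note that the same constraints with a *minimization* objective would not work: nothing would stop $z$ from going to $-\infty$, which is exactly why the answer stresses that the reformulation relies on the objective being a maximization.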