Indicator constraint in optimization


I am trying to formulate an optimization problem with the following constraint: $y = 1$ if $x \le c$ and $y = 0$ if $x > c$, which is the indicator function $y = 1[x \le c]$. Here $c$ is the decision variable (and therefore $y$ is as well).

I noticed that big-M is the typical way of dealing with indicator constraints, but it only handles the opposite direction, e.g., if $y = 1$ then $x \le c$.

I wonder if there's a way of formulating this as an optimization constraint. Thank you!

Update: The optimization problem would be something like \begin{align*} & \max \;\; f(y) \\ & \text{s.t.} \;\; y = 1[x \le c]\\ & y \in \{0,1\}, c \in \mathbb{R} \end{align*} where $x$ is a fixed parameter. The objective is a function of $y$, so we would like to choose $c$ (which can be thought of as a threshold) so that $f(y)$ is maximized.

2 Answers

Accepted answer

You want to enforce two implications: \begin{align} x \le c &\implies y = 1 \\ x > c &\implies y = 0 \end{align} Equivalently, you want to enforce their contrapositives: \begin{align} y = 0 &\implies x > c \tag1\label1 \\ y = 1 &\implies x \le c \tag2\label2 \end{align} Assume $L_c \le c \le U_c$ for constants $L_c$ and $U_c$. As you suggested, \eqref{2} can be enforced by a big-M constraint: $$x - c \le (x-L_c)(1-y)$$ If you introduce a small constant tolerance $\epsilon>0$, \eqref{1} can also be enforced by a big-M constraint: $$c - x + \epsilon \le (U_c-x+\epsilon)y$$
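As a sanity check, the two big-M constraints can be tested numerically. The bounds $L_c = -10$, $U_c = 10$ and the tolerance $\epsilon$ below are illustrative values chosen for the test, not part of the answer:

```python
# Numerical sanity check of the two big-M constraints.
# L_c, U_c bound c and eps is a small tolerance (illustrative values).
L_c, U_c, eps = -10.0, 10.0, 1e-6

def bigM_feasible(x, c, y):
    """True iff (x, c, y) satisfies both big-M constraints."""
    impl2 = (x - c) <= (x - L_c) * (1 - y)        # y = 1  =>  x <= c
    impl1 = (c - x + eps) <= (U_c - x + eps) * y  # y = 0  =>  c <= x - eps
    return impl2 and impl1

# For each (x, c), only the "correct" y = 1[x <= c] should be feasible.
for x, c in [(3.0, 5.0), (5.0, 3.0), (4.0, 4.0)]:
    y_correct = 1 if x <= c else 0
    assert bigM_feasible(x, c, y_correct)
    assert not bigM_feasible(x, c, 1 - y_correct)
```

Note that at $y = 0$ the first constraint relaxes to $c \ge L_c$ and at $y = 1$ the second relaxes to $c \le U_c$, so both big-M coefficients are valid whenever $L_c \le c \le U_c$.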


Another way to think about this is that you want $x \le c \le U_c$ if $y=1$ and $L_c \le c \le x-\epsilon$ if $y=0$. So combine these two constraints as $$xy + L_c(1-y) \le c \le U_c y + (x-\epsilon)(1-y)$$
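A quick numerical check of this combined "sandwich" constraint, again with illustrative bounds $L_c = -10$, $U_c = 10$ and tolerance $\epsilon$ (assumed values, not from the answer):

```python
# Sanity check of  x*y + L_c*(1-y) <= c <= U_c*y + (x - eps)*(1-y).
L_c, U_c, eps = -10.0, 10.0, 1e-6  # illustrative bounds and tolerance

def sandwich_feasible(x, c, y):
    """True iff c lies in the interval selected by y."""
    lo = x * y + L_c * (1 - y)          # y=1: x <= c,   y=0: L_c <= c
    hi = U_c * y + (x - eps) * (1 - y)  # y=1: c <= U_c, y=0: c <= x - eps
    return lo <= c <= hi

for x, c in [(3.0, 5.0), (5.0, 3.0), (4.0, 4.0)]:
    y_correct = 1 if x <= c else 0
    assert sandwich_feasible(x, c, y_correct)
    assert not sandwich_feasible(x, c, 1 - y_correct)
```

Note this form is bilinear in $(x, y)$ as written; since $x$ is a fixed parameter here, the products $xy$ are still linear in the decision variables.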

Another answer

$(1-y)L_c \le c-x \le yU_c + \epsilon(y-1)$
where $U_c > 0$ is an upper bound and $L_c < 0$ is a lower bound on $c - x$, and $\epsilon > 0$ is a small number. When $y = 1$ this reads $0 \le c - x \le U_c$ (so $x \le c$), and when $y = 0$ it reads $L_c \le c - x \le -\epsilon$ (so $x > c$).
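A quick numerical check of this one-line formulation, reading $L_c < 0$ and $U_c > 0$ as bounds on $c - x$ (the specific numbers below are illustrative assumptions):

```python
# Sanity check of  (1-y)*L_c <= c - x <= y*U_c + eps*(y-1).
L_c, U_c, eps = -10.0, 10.0, 1e-6  # illustrative bounds on c - x, tolerance

def one_line_feasible(x, c, y):
    """True iff (x, c, y) satisfies the one-line big-M constraint."""
    return (1 - y) * L_c <= c - x <= y * U_c + eps * (y - 1)

# Only y = 1[x <= c] should be feasible for each (x, c) pair.
for x, c in [(3.0, 5.0), (5.0, 3.0), (4.0, 4.0)]:
    y_correct = 1 if x <= c else 0
    assert one_line_feasible(x, c, y_correct)
    assert not one_line_feasible(x, c, 1 - y_correct)
```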