A robust convex optimization problem


Consider a function $ f: \mathbb{R}^n \times \mathbb{R}^m \rightarrow \mathbb{R} $ such that $\forall x \in \mathbb{R}^n$ the map $f(x,\cdot)$ is convex, and $\forall y\in \mathbb{R}^m$ the map $f(\cdot,y)$ is convex as well. We assume that $m<n$.

Given $y^1, ..., y^N \in \mathbb{R}^m$, and $c \in \mathbb{R}^n$, consider the following convex optimization problem.

$$ \min_{ x \in \mathbb{R}^n } c^\top x \ \ \ \text{ sub. to: } \ f(x,y^i) \leq 0 \ \ \forall i \in [1,N] $$

Let $x^\star$ be its optimizer, which we assume to be unique.

Notice that the above optimization problem is equivalent to the following one: indeed, by convexity of $f(x,\cdot)$, any $y = \sum_i \lambda_i y^i$ with $\lambda_i \geq 0$, $\sum_i \lambda_i = 1$ satisfies $f(x,y) \leq \sum_i \lambda_i f(x,y^i) \leq 0$ whenever $x$ is feasible for the first problem.

$$ \min_{ x \in \mathbb{R}^n } c^\top x \ \ \ \text{ sub. to: } \ f(x,y) \leq 0 \ \ \forall y \in \text{conv}( \{y^1, ..., y^N\} ) $$

We now say that $f(x,y^k) \leq 0$ is a "true" constraint if it "really matters", i.e. any optimizer of the problem

$$ \min_{ x \in \mathbb{R}^n } c^\top x \ \ \ \text{ sub. to: } \ f(x,y^i) \leq 0 \ \ \forall i \in ([1,N]\setminus\{k\}), $$

which we denote by $x_{ \neg k }^\star$, satisfies $c^\top x_{ \neg k }^\star < c^\top x^\star$. (Since removing a constraint can only enlarge the feasible set, we always have $c^\top x_{ \neg k }^\star \leq c^\top x^\star$.)
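For a concrete feel, this definition can be checked by brute force: solve the problem once, then re-solve it $N$ times with one constraint deleted and compare optimal values. A minimal sketch on a hypothetical linear instance (each $f(\cdot, y^i)$ a half-space; `linprog`'s default bounds $x \geq 0$ are added so the LP stays bounded):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical linear instance: f(x, y^i) = b_i - a_i^T x, so each constraint
# reads a_i^T x >= b_i; linprog's default bounds x >= 0 keep the LP bounded.
c = np.array([1.0, 2.0])
A_ub = np.array([[-1.0, -1.0],   # x1 + x2 >= 2
                 [-1.0,  0.0],   # x1 >= 0.5
                 [ 0.0, -1.0],   # x2 >= 0.1
                 [-1.0, -1.0]])  # x1 + x2 >= 1  (redundant)
b_ub = np.array([-2.0, -0.5, -0.1, -1.0])

full = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

def is_true_constraint(k, tol=1e-9):
    """Constraint k is 'true' iff deleting it strictly improves the optimum."""
    mask = np.arange(len(b_ub)) != k
    red = linprog(c, A_ub=A_ub[mask], b_ub=b_ub[mask], method="highs")
    return red.status == 3 or red.fun < full.fun - tol  # status 3: unbounded

n_true = sum(is_true_constraint(k) for k in range(len(b_ub)))
print(n_true)  # 2: only the two constraints active at x* strictly matter here
```

This costs $N+1$ solves, which is exactly the a posteriori work the answers below try to avoid.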

What is the number of true constraints?


There are 3 answers below.

Answer 1 (score 1):

In general, removing $y^k$ alters the feasible region iff $\operatorname{conv}(\{y^i,\ i=1,\ldots,N\}) \ne \operatorname{conv}(\{y^i,\ i=1,\ldots,N,\ i \ne k\})$. It is easy to construct examples where all $y^i$ are extreme points of their convex hull. Note, however, that even when the removal does alter the feasible region, the optimal solution may not change.
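This extreme-point test can be sketched numerically: scipy's `ConvexHull` reports exactly the indices of the extreme points, here on a hypothetical sample of five points in the plane (the data and the example are illustrative assumptions, not from the question):

```python
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical sample: the four corners of the unit square plus its centre.
Y = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])

hull = ConvexHull(Y)
extreme = sorted(hull.vertices)   # indices of the extreme points of conv(Y)
print(extreme)                    # the centre, index 4, is not among them

# Deleting the non-extreme point leaves conv(Y), and hence the feasible
# region of the robust problem, unchanged.
hull_without_centre = ConvexHull(Y[:4])
print(np.isclose(hull.volume, hull_without_centre.volume))  # True (same area)
```

(`ConvexHull.volume` is the area in two dimensions.)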

Answer 2 (score 4):

I guess that by "true" constraints you mean "active" or "blocking" constraints, that is, constraints such that $f(x^*,y^i)=0$.

Indeed, by convexity, if a constraint is not blocking -- i.e., $f(x^*,y^i)<0$ -- then removing it from the set of constraints does not change the optimum.

Counting the blocking constraints is quite easy a posteriori, that is, once you know the optimum. In fact, most solvers report the Lagrange multipliers $\lambda_i$ (the sensitivities of the optimal value to a perturbation of the $i$-th constraint), and these are nonzero exactly when the corresponding constraints block, assuming strict complementarity.
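A minimal sketch of this a posteriori count on a hypothetical LP instance: with scipy's HiGHS backend, `res.slack` gives the constraint slacks and `res.ineqlin.marginals` the dual values (sensitivities of the optimal value to the right-hand sides), so both the zero-slack test and the nonzero-multiplier test are one line each:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LP: minimize x1 + 2*x2 s.t. x1 + x2 >= 2, x1 >= 0.5, x2 >= 0.1, x >= 0.
c = [1.0, 2.0]
A_ub = [[-1.0, -1.0], [-1.0, 0.0], [0.0, -1.0]]
b_ub = [-2.0, -0.5, -0.1]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

# Blocking constraints have zero slack; under strict complementarity they are
# also exactly the ones with a nonzero dual value (marginal).
active = np.flatnonzero(np.abs(res.slack) < 1e-9)
nonzero_duals = np.flatnonzero(np.abs(res.ineqlin.marginals) > 1e-9)

print(active.tolist(), nonzero_duals.tolist())  # both report constraints 0 and 2
```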

Counting the blocking constraints a priori, before computing the optimum, is an NP-hard problem. Even simplifying the description of the feasible set $\{x \mid \forall i,\ f(x,y^i)\le0\}$ by removing redundant constraints is NP-hard.

Maybe you can find something starting from there.

Answer 3 (score 0):

What are you calling a "true" constraint? Suppose you want to minimize $c(x)=x^2$ under some sets of constraints:

  1. $x\le1$ and $x\le2$, optimum $x^*=0$ [$f(x,y)=x-y$, $y_1=1$, $y_2=2$]. The second constraint is redundant: it can be eliminated because it is a consequence of the first ($y_1 \lt y_2$).

  2. $x\le1$, optimum $x^*=0$ [$f(x,y)=x-y$, $y_1=1$]. This constraint is inactive: it is not eliminated because of other constraints, but because it holds strictly at the optimum ($x^*\lt y_1$).

  3. $x\le-1$, optimum $x^*=-1$ [$f(x,y)=x-y$, $y_1=-1$]. This constraint is active: you cannot remove it without changing the optimum.

Type 3 is clearly a "true" constraint and type 1 clearly is not. But what about type 2?
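The three cases can be replayed with a trivial helper: the minimizer of $x^2$ over $\{x \le u \text{ for all } u\}$ is the unconstrained minimizer $0$ clamped to the tightest upper bound (a toy sketch, not part of the answer above):

```python
def minimize_sq(uppers):
    """Minimizer of x^2 subject to x <= u for every u in uppers:
    the unconstrained minimizer 0, clamped to the tightest bound."""
    return min(0.0, min(uppers))

print(minimize_sq([1.0, 2.0]))  # 0.0  -- both inactive; x <= 2 is also redundant
print(minimize_sq([1.0]))       # 0.0  -- inactive, yet not redundant
print(minimize_sq([-1.0]))      # -1.0 -- active: removing it moves x* to 0
```

Dropping the type-2 constraint here leaves the optimum at $0$, so by the question's definition it is not a "true" constraint, even though it is not implied by any other constraint.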