Maximizing the minimum of linear functions subject to linear inequality constraints


The problem is to maximize $\min(x_{1},x_{2},\dots,x_{n})$ over a decision vector $r$, where $x = Qr$, subject to $Ar \leq b$ and $0 \leq r_{i} \leq 1$. The approach I have come up with is to iteratively minimize $\sum_{i=1}^{n} (K/x_{i})^{m}$, where $K$ is a scaling constant for numerical stability, updated at every iteration to the smallest $x_{i}$, and $m$ is gradually increased at every iteration. In theory, as $m$ tends to infinity, the minimizer of this smoothed problem tends to the solution of the original max-min problem. Each inner minimization can be handled by standard convex optimization algorithms.
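For concreteness, here is a minimal sketch of this smoothing scheme using SciPy. The data `Q`, `A`, `b`, the dimensions, and the schedule for $m$ are hypothetical stand-ins, not taken from the question:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Hypothetical problem data: x = Q r, constraints A r <= b and 0 <= r_i <= 1.
rng = np.random.default_rng(0)
n, p = 5, 4                       # n entries of x, p entries of r (assumed sizes)
Q = rng.uniform(0.5, 1.5, (n, p)) # positive entries keep x > 0 on the feasible set
A = rng.uniform(0.0, 1.0, (3, p))
b = np.full(3, 2.0)

def smooth_obj(r, K, m):
    x = Q @ r
    return np.sum((K / x) ** m)   # penalizes the smallest x_i hardest as m grows

r = np.full(p, 0.5)               # interior starting point, so all x_i > 0
K = np.min(Q @ r)
cons = LinearConstraint(A, -np.inf, b)
bounds = [(0.0, 1.0)] * p

for m in [2, 4, 8, 16, 32]:       # gradually sharpen the approximation
    res = minimize(smooth_obj, r, args=(K, m), bounds=bounds, constraints=[cons])
    r = res.x
    K = np.min(Q @ r)             # rescale with the current smallest x_i
print("approximate max-min value:", K)
```

The outer loop over increasing $m$ is exactly what makes this slow: each value of $m$ requires a full constrained minimization.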

This algorithm seems to work well, but it takes a very long time. Is there a more efficient algorithm I can use to solve this? I thought about projected gradient descent, but as far as I know it requires a continuous gradient to converge, and $\min(\cdot)$ is not differentiable.

1 Answer

To maximize the minimum, introduce a new variable $z$ to represent $\min(x_1,\dots,x_n)$, introduce the linear constraints $z \le x_i$ for all $i$, and maximize $z$. Since the objective and all constraints are now linear, the result is a linear program, which any LP solver handles directly with no smoothing parameter or outer loop.
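A minimal sketch of this reformulation with `scipy.optimize.linprog`, using the same hypothetical `Q`, `A`, `b` stand-ins as above; the stacked decision vector is $[r; z]$:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data in the question's notation: x = Q r, A r <= b, 0 <= r_i <= 1.
rng = np.random.default_rng(0)
n, p = 5, 4
Q = rng.uniform(0.5, 1.5, (n, p))
A = rng.uniform(0.0, 1.0, (3, p))
b = np.full(3, 2.0)

# Decision vector is [r_1, ..., r_p, z]; maximize z, i.e. minimize -z.
c = np.zeros(p + 1)
c[-1] = -1.0

# z <= x_i becomes -Q[i] . r + z <= 0, stacked on top of the original A r <= b.
A_ub = np.block([[-Q, np.ones((n, 1))],
                 [A, np.zeros((A.shape[0], 1))]])
b_ub = np.concatenate([np.zeros(n), b])

bounds = [(0.0, 1.0)] * p + [(None, None)]   # r in [0, 1], z free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal min(x):", -res.fun)
print("r* =", res.x[:p])
```

A single `linprog` call replaces the entire sequence of smoothed convex problems, which is why this formulation is the standard way to solve max-min problems with linear structure.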