Efficient Algorithm for finding MinMax of Quadratic Functions?


Given a set of n quadratic functions

$f_i(x) = x^T A_i x + b_i^T x + c_i$

with $x\in\mathbb{R}^m$, $i\in\{0,\dots,n-1\}$, and each $A_i$ negative definite, so that each $f_i$ is concave and has a maximum.

I am looking for $x$ that maximises the minimum of these functions, that is

$\arg\max_x \min_i f_i(x)$

I assume that the minimax theorem doesn't apply directly because my functions $f_i$ are not convex but concave (each $A_i$ is negative definite).

In my specific application, the number of these functions $n$ can be in the thousands and the number of dimensions $m$ of the vector $x$ can be in the tens.

Example for $n=3$ and $m=1$ (plot not shown). The solution in this case is $x=0$.
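The quadratics behind the original plot aren't given, so here is a minimal sketch with three made-up concave quadratics ($n=3$, $m=1$) whose max-min point is $x=0$, found by brute-force grid search over the pointwise minimum:

```python
import numpy as np

# Hypothetical concave quadratics (illustrative choice, not the
# question's actual data): each has a negative leading coefficient.
fs = [
    lambda x: -(x - 1.0) ** 2,   # maximum at x = 1
    lambda x: -(x + 1.0) ** 2,   # maximum at x = -1
    lambda x: -x ** 2 + 1.0,     # maximum at x = 0
]

# Evaluate the pointwise minimum on a grid and take its maximizer.
xs = np.linspace(-2.0, 2.0, 4001)
min_vals = np.min([f(xs) for f in fs], axis=0)
x_star = xs[np.argmax(min_vals)]
print(x_star)  # close to 0 for this symmetric example
```

Grid search is only viable for $m=1$; it is shown here just to make the max-min objective concrete.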

Can somebody please recommend an efficient algorithm for solving this?

Accepted answer:

Introduce the probability simplex $$ Y=\{y\in{\Bbb R}^n\colon \sum_{i=1}^ny_i=1,\ y_i\ge 0\}. $$ Then $$ \min_{1\le i\le n}f_i(x)=\min_{y\in Y}\sum_{i=1}^n y_if_i(x), $$ and the function $$ F(x,y)=\sum_{i=1}^n y_if_i(x) $$ satisfies the hypotheses of the minimax theorem: in particular, it is linear (hence convex) in $y$ and concave in $x$.
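A quick numerical sanity check of the simplex identity, using hypothetical values $f_i(x)$ at some fixed $x$: the linear function $y\mapsto\sum_i y_i f_i(x)$ attains its minimum over the simplex at a vertex $e_i$, so no convex combination can do better than the smallest $f_i(x)$:

```python
import numpy as np

# Hypothetical values f_i(x) at some fixed x (made-up numbers).
f_vals = np.array([2.0, -1.5, 0.3, 4.0])

# Sample many points of the probability simplex via a Dirichlet
# distribution; every convex combination is at least the vertex minimum.
rng = np.random.default_rng(1)
y = rng.dirichlet(np.ones(len(f_vals)), size=10000)
simplex_vals = y @ f_vals
print(simplex_vals.min() >= f_vals.min())  # no sample beats min_i f_i(x)
```

This is only an empirical illustration of the standard fact that a linear program over the simplex is solved at a vertex.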

To solve the problem, however, it is more convenient to use the equivalence $$ \min_i f_i(x)\ge\gamma\qquad\Leftrightarrow\qquad f_i(x)\ge\gamma,\quad\forall i, $$ which gives the equivalent problem $$ \max_{x,\gamma}\gamma\qquad\text{subject to } f_i(x)\ge\gamma,\ \forall i. $$ Since each $f_i$ is concave, this is a convex optimization problem, for which many numerical algorithms are available.
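A sketch of this epigraph formulation, using randomly generated negative definite $A_i$ (not data from the question) and SciPy's SLSQP solver; for thousands of constraints a dedicated convex solver would likely scale better, but the structure of the reformulation is the same:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up test data: n concave quadratics f_i(x) = x^T A_i x + b_i^T x + c_i
# in m dimensions, with each A_i negative definite.
rng = np.random.default_rng(0)
m, n = 3, 5
As, bs, cs = [], [], []
for _ in range(n):
    M = rng.standard_normal((m, m))
    As.append(-(M @ M.T) - np.eye(m))   # negative definite => f_i concave
    bs.append(rng.standard_normal(m))
    cs.append(rng.standard_normal())

def f(i, x):
    return x @ As[i] @ x + bs[i] @ x + cs[i]

# Decision vector z = (x, gamma); maximize gamma <=> minimize -gamma,
# subject to the inequality constraints f_i(x) - gamma >= 0.
def neg_gamma(z):
    return -z[-1]

cons = [{"type": "ineq", "fun": (lambda z, i=i: f(i, z[:m]) - z[-1])}
        for i in range(n)]

res = minimize(neg_gamma, np.zeros(m + 1), constraints=cons, method="SLSQP")
x_opt, gamma_opt = res.x[:m], res.x[-1]
print(gamma_opt, min(f(i, x_opt) for i in range(n)))
```

At the optimum, $\gamma$ matches the smallest $f_i(x)$, since at least one constraint is active there.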