Minimax theorem for convex quadratic programming


I have a simple question. Suppose I have a convex quadratic program with polyhedral constraints (so $A$ is symmetric positive semidefinite): $$ \begin{aligned} \inf_{x \in \mathbb{R}^{n}} \; & x^{\top} Ax + b^{\top}x\\ & \text{s.t. } Hx \le h. \end{aligned} $$

If I introduce the Lagrangian and write the problem as $$ \inf_{x\in\mathbb{R}^{n}} \sup_{\beta \ge 0} \; x^{\top}Ax + b^{\top}x + \beta^{\top}(Hx-h), $$ under which conditions does the strong-duality identity $$ \inf_{x\in\mathbb{R}^{n}} \sup_{\beta \ge 0} \; x^{\top}Ax + b^{\top}x + \beta^{\top}(Hx-h) = \sup_{\beta \ge 0} \inf_{x\in\mathbb{R}^{n}} \; x^{\top}Ax + b^{\top}x + \beta^{\top}(Hx-h) $$ hold?

The hypotheses of von Neumann's minimax theorem do not hold here: both $\mathbb{R}^n$ and $\{\beta \ge 0\}$ are unbounded, so compactness fails. How should I approach the minimax equality in this setting?
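For what it is worth, here is a small numerical sanity check I tried on a toy instance (the matrices $A$, $b$, $H$, $h$ below are my own arbitrary choice, not from any reference). With $A \succ 0$, the inner infimum over $x$ is attained at $x^*(\beta) = -\tfrac{1}{2}A^{-1}(b + H^{\top}\beta)$, which gives the explicit concave dual $g(\beta) = -\tfrac{1}{4}(b + H^{\top}\beta)^{\top} A^{-1} (b + H^{\top}\beta) - \beta^{\top} h$. Solving the primal and the dual separately with `scipy.optimize.minimize`, the two optimal values agree on this instance:

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance (my own choice): A positive definite, so the problem is convex,
# and x = 0 is strictly feasible (Hx = 0 < h), so Slater's condition holds.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
H = np.array([[1.0, 1.0]])
h = np.array([1.0])

# Primal: minimize x^T A x + b^T x subject to Hx <= h.
primal = minimize(
    lambda x: x @ A @ x + b @ x,
    x0=np.zeros(2),
    constraints=[{"type": "ineq", "fun": lambda x: h - H @ x}],
)

# Dual: maximize g(beta) over beta >= 0, where (for A invertible)
#   g(beta) = -(1/4)(b + H^T beta)^T A^{-1} (b + H^T beta) - beta^T h.
# We minimize -g with a nonnegativity bound on beta.
Ainv = np.linalg.inv(A)

def neg_dual(beta):
    v = b + H.T @ beta
    return 0.25 * v @ Ainv @ v + beta @ h

dual = minimize(neg_dual, x0=np.zeros(1), bounds=[(0.0, None)])

print(primal.fun, -dual.fun)  # the two values coincide: no duality gap
```

On this instance the constraint is inactive at the optimum, the optimal $\beta$ is $0$, and both values come out to $-0.375$, consistent with strong duality. Of course a single example proves nothing; I am asking for the general condition.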

Any hints would be appreciated!