Maximize scalar product with argument in bounded intersection of half spaces


Given $a,v_1,\dots,v_n\in \mathbb{R}^d$, $n,d\in\mathbb{N}$, how can one compute a maximizer of $$ f: \begin{cases} \{x\in\mathbb{R}^d\vert\; \|x\|_2\le1\land \forall i\in \{1,\dots,n\}\;\langle x,v_i\rangle\le 0\}&\to\mathbb{R}\\\\ x&\mapsto \langle x,a\rangle \end{cases} $$ In other words, I want to find a direction $x^*$ that maximizes the scalar product with the vector $a$ among all directions lying in the convex region specified by hyperplane inequalities whose hyperplanes pass through the origin $0\in\mathbb{R}^d$.

My questions:

  1. Does this problem have a specific name?
  2. How to solve it algorithmically?
  3. What is the complexity of such an algorithm in terms of $n$ and $d$?
Best answer:

The problem can be written compactly as $$\max_x\left\{a^Tx : v_i^Tx \leq 0 \; \forall i, ||x||\leq1\right\}.$$ This is a second-order cone optimization problem. Introducing multipliers $y_i\geq 0$ for the constraints $v_i^Tx\leq 0$, the Lagrangian is $(a-\sum_i y_i v_i)^Tx$; maximizing it over $||x||\leq 1$ gives the dual problem $$\min_y \left\{ ||a-\sum_i y_i v_i|| : y\geq 0 \right\}. $$ The dual problem is a nonnegative least squares problem (the $a$ here is the data vector $y$ on Wikipedia, and the $v_i$ here are the columns of $A$ on Wikipedia). That means that there is in general no closed-form solution.
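As a sketch, the dual can be handed directly to an off-the-shelf nonnegative-least-squares solver. The instance below (a $d=3$, $n=2$ example) is made up for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical instance: maximize <x, a> subject to <x, v_i> <= 0
# and ||x|| <= 1.  The matrix V has the v_i as its columns.
a = np.array([1.0, 2.0, 0.5])
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Dual: min_{y >= 0} ||V y - a||_2 -- exactly the NNLS problem
# solved by scipy.optimize.nnls.
y_star, residual = nnls(V, a)
# Here y_star = [1, 2] and residual = 0.5; by strong duality the
# residual equals the optimal primal value.
```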

You can solve either the primal or the dual problem as a generic second-order cone program. Interior-point methods solve such programs to any fixed accuracy in time polynomial in $n$ and $d$.

If you prefer an algorithm tailored to nonnegative least squares, solve the dual for $y^*$ and recover the optimal solution of your original problem as $x^* = (a-\sum_i y^*_i v_i) / ||a-\sum_i y^*_i v_i||$, provided the residual $a-\sum_i y^*_i v_i$ is nonzero. If the residual is zero, then $a$ lies in the cone generated by the $v_i$, the optimal value is $0$, and $x^*=0$ is a maximizer.
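A minimal sketch of this recovery on made-up data ($d=3$, $n=2$), with the degenerate zero-residual case handled:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical instance; columns of V are the v_i.
a = np.array([1.0, 2.0, 0.5])
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

y_star, _ = nnls(V, a)       # optimal dual solution
r = a - V @ y_star           # residual a - sum_i y*_i v_i
if np.linalg.norm(r) > 1e-12:
    x_star = r / np.linalg.norm(r)   # optimal primal direction
else:
    # a lies in the cone spanned by the v_i: the maximum is 0, at x* = 0.
    x_star = np.zeros_like(a)
```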