I have the following optimization problem: \begin{align} \min_{x} \quad & x^\top \hat{x} \\ \text{s.t.} \quad & x^\top a_i \leq 0, \quad \forall i \in \{ 1,\ldots,N \} \\ & \|x\|_2 = 1, \end{align} where $x \in \mathbb{R}^n$ is the decision variable and $\hat{x}, a_1, \ldots, a_N$ are given vectors. In other words, the objective function is linear, the linear inequality constraints define a polyhedral cone, and the equality constraint restricts $x$ to the unit sphere.
I am interested in hardness results for this problem: for example, proving (or disproving) that it is NP-hard in general, or showing that it can be solved efficiently via some kind of reformulation. So, my questions are:
- Is this problem NP-hard in general?
- If not, how can we efficiently solve it?
I can only give you a partial answer to the second question, since I do not know much about proving hardness results. There are several numerical approaches; I will just give an outline, because each of them would merit an answer of its own.
1.) Relaxation method: Just consider $$ \Vert x \Vert_2 \leq 1$$ instead, and you have a convex optimization problem. Then you pray that the relaxed solution is close enough. This might even work, depending strongly on your $\hat{x}$; or maybe a transformation of $\hat{x}$ is needed. In any case, convex problems are supported by a large number of references and computational libraries/solvers.
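As a concrete sketch of the relaxation (my own made-up instance, solved with SciPy's SLSQP; a dedicated conic solver would be the more robust choice): in this particular example the ball minimizer $-\hat{x}/\Vert\hat{x}\Vert_2$ happens to satisfy the cone constraints, so the relaxation is tight and the solution lands on the sphere.

```python
import numpy as np
from scipy.optimize import minimize

# Made-up problem data: rows of A are the a_i, so the cone is {x : A x <= 0}.
x_hat = np.array([1.0, 1.0, -0.5])
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])  # enforces x_1 <= 0 and x_2 <= 0

# Relaxation: replace ||x||_2 = 1 by ||x||_2 <= 1, giving a convex feasible set.
constraints = [
    {"type": "ineq", "fun": lambda x: -A @ x},       # A x <= 0
    {"type": "ineq", "fun": lambda x: 1.0 - x @ x},  # ||x||_2^2 <= 1
]
res = minimize(lambda x: x @ x_hat, x0=np.zeros(3),
               constraints=constraints, method="SLSQP")
x_rel = res.x
# For this data the minimizer is -x_hat/||x_hat||_2, with optimal value -1.5.
```

If the relaxed minimizer instead has norm strictly less than 1, the relaxation is not tight for that instance, and one of the other methods (or a correction step) is needed.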
2.) Penalization method: Replace the sphere constraint by adding a penalization term to your objective function, e.g. a Ginzburg-Landau penalization $\frac{1}{\varepsilon}(\Vert x \Vert_2 -1)^2$. Of course, now you also have to tune the parameter $\varepsilon$, which is often quite subtle. Moreover, the penalized problem is no longer linear, or even convex.
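A minimal sketch of the penalization approach (again my own made-up data; the choice of $\varepsilon$, the starting point, and the local solver are all arbitrary): the sphere constraint is swapped for the Ginzburg-Landau term, the cone constraints are kept, and the resulting non-convex problem is handed to a local method.

```python
import numpy as np
from scipy.optimize import minimize

x_hat = np.array([1.0, 1.0, -0.5])
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

eps = 1e-2  # penalty parameter: smaller eps enforces ||x||_2 = 1 more strictly

def penalized_obj(x):
    # Original linear objective plus Ginzburg-Landau penalty for the sphere.
    return x @ x_hat + (np.linalg.norm(x) - 1.0) ** 2 / eps

res = minimize(penalized_obj,
               x0=np.array([-1.0, -1.0, 0.0]),  # feasible, away from the
               constraints=[{"type": "ineq",    # non-smooth point x = 0
                             "fun": lambda x: -A @ x}],
               method="SLSQP")
x_pen = res.x
```

Note that the minimizer overshoots the sphere slightly (for this data the stationarity condition gives $\Vert x \Vert_2 = 1 + \tfrac{\varepsilon}{2}\Vert\hat{x}\Vert_2$), which is the usual price of penalization and the reason $\varepsilon$ needs tuning.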
3.) Projection method: You completely ignore the sphere constraint, solve the resulting problem, and then, in a second step, simply project your solution back onto the unit sphere. Convergence is of course non-trivial, and further conditions, such as orthogonality conditions, might be needed. This projection step is also often called a retraction in the context of differential geometry, AFAIK.
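One common iterative realization of this idea (my sketch, with the cone constraints dropped for brevity) is projected gradient descent on the sphere: take a gradient step on the linear objective, then retract by normalizing. Without the cone constraints the minimizer on the sphere is simply $-\hat{x}/\Vert\hat{x}\Vert_2$, which the iteration recovers.

```python
import numpy as np

x_hat = np.array([1.0, -2.0, 0.5, 3.0])  # made-up data

def retract(x):
    # Metric projection onto the unit sphere -- a retraction.
    return x / np.linalg.norm(x)

x = retract(np.array([1.0, 1.0, 1.0, 1.0]))  # arbitrary starting point
step = 0.1
for _ in range(500):
    # The gradient of x @ x_hat is x_hat; step, then project back.
    x = retract(x - step * x_hat)

x_star = -x_hat / np.linalg.norm(x_hat)  # closed-form minimizer (no cone)
```

Handling the cone constraints as well requires more care (e.g. projecting onto the intersection, or an augmented-Lagrangian treatment of the inequalities), which is where the convergence subtleties mentioned above come in.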
For method 3, I recommend looking into the following reference; see e.g. p. 71 f. for a concrete algorithm for a related problem: N. Boumal, An Introduction to Optimization on Smooth Manifolds, Cambridge University Press, 2023 (Zbl 07633911).