Maximizing $\|Ax\|$ subject to $\|x\|=\sqrt{n}$, and $|x_i|=1,\forall i$


I would like to discuss my thoughts on the following optimization problem:

$$\text{Maximize } \|Ax\| \text{ subject to } \|x\|=\sqrt{n} \text{ and } |x_i|=1,\ \forall i, \tag{1}$$

where $x\in \mathbb{R}^n$, and to check whether my understanding is correct. I believe (1) can be equivalently reformulated as follows:

$$\text{Maximize } x^TA^TAx \text{ subject to } x^Tx=n \text{ and } x_i\in\{-1,1\},\ \forall i.$$ The above problem is non-convex because both constraints are non-convex. (Note also that $|x_i|=1$ for all $i$ already forces $x^Tx=n$, so the norm constraint is redundant.) Dropping the last constraint, we have:

$$\text{Maximize } x^TA^TAx \text{ subject to } x^Tx=n.$$ Although this problem is non-convex, we can solve it efficiently by constructing its dual. The solution is $x^*=\sqrt{n}z^*$, where $z^*$ is the eigenvector corresponding to the largest eigenvalue of $A^TA$, and the optimal value is $n \lambda_{max}$, where $\lambda_{max}$ is the largest eigenvalue of $A^TA$. This follows from rescaling the following problem:

$$\text{Maximize } z^TA^TAz \text{ subject to } z^Tz=1.$$
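The rescaling argument above can be checked numerically. The sketch below (with an arbitrary random matrix $A$ and $n=4$ as illustrative choices) computes the top eigenpair of $A^TA$ and verifies that $x^*=\sqrt{n}\,z^*$ attains the value $n\lambda_{max}$:

```python
import numpy as np

# Spherical relaxation: maximize x^T A^T A x subject to x^T x = n.
# The maximizer is x* = sqrt(n) * z*, where z* is the unit eigenvector
# of A^T A for its largest eigenvalue; the optimum is n * lambda_max.
# (The matrix A and the dimension n here are illustrative choices.)
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
M = A.T @ A

# eigh returns eigenvalues in ascending order for symmetric matrices.
eigvals, eigvecs = np.linalg.eigh(M)
lam_max = eigvals[-1]        # largest eigenvalue of A^T A
z_star = eigvecs[:, -1]      # corresponding unit eigenvector

x_star = np.sqrt(n) * z_star          # rescale to the sphere x^T x = n
value = x_star @ M @ x_star           # objective at x*

print(np.isclose(x_star @ x_star, n))      # prints True: feasibility
print(np.isclose(value, n * lam_max))      # prints True: optimal value
```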

If we keep the last constraint and drop the second one instead, then we have:

$$\text{Maximize } x^TA^TAx \text{ subject to } x_i\in\{-1,1\},\ \forall i.$$ This problem is in general NP-hard, and a simple method to obtain its optimal solution is exhaustive search. Moreover, since $x_i\in\{-1,1\}$ for all $i$ already implies $x^Tx=n$, this problem is not merely a relaxation of (1) but is equivalent to it, so I believe it is reasonable to conclude that (1) is in general NP-hard as well.
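For small $n$ the exhaustive search is easy to carry out, and the spherical relaxation gives an upper bound $n\lambda_{max}$ on the binary optimum. A minimal sketch (again with an illustrative random $A$; the $2^n$ enumeration is only feasible for small $n$):

```python
import itertools
import numpy as np

# Brute-force the binary problem: maximize x^T A^T A x over x in {-1,1}^n.
# Exhaustive search costs 2^n objective evaluations, so this is only
# practical for small n. (A is an illustrative random matrix.)
rng = np.random.default_rng(1)
n = 8
A = rng.standard_normal((n, n))
M = A.T @ A

best_val, best_x = -np.inf, None
for signs in itertools.product([-1.0, 1.0], repeat=n):
    x = np.array(signs)
    val = x @ M @ x
    if val > best_val:
        best_val, best_x = val, x

# Every sign vector satisfies x^T x = n, so the spherical relaxation's
# optimum n * lambda_max upper-bounds the binary optimum.
upper_bound = n * np.linalg.eigvalsh(M)[-1]
print(best_val <= upper_bound + 1e-9)  # prints True
```

Because the feasible set $\{-1,1\}^n$ lies on the sphere $x^Tx=n$, the brute-force optimum can never exceed the relaxation's value, which the final check confirms.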

I am not sure if my understanding is correct. Thus, I would like to invite the community to share their opinions and correct me if I am wrong.