Is the standard eigenvalue optimization problem convex?


For an arbitrary symmetric matrix $A$, is the standard eigenvalue problem convex?

$\lambda_{\max}(A) = \max_{\|x\| \leq 1} x^{T}Ax$


Thanks to Michael for catching my previous errors.

You need to be careful with terminology here.

In the following, I am assuming that $A$ is symmetric.

The function $\phi(A) = \max_{\|x\|\le 1} \langle x, Ax \rangle$ is always convex because it is the $\sup$ of a collection of convex (in fact, linear) functions $A \mapsto \langle x, Ax \rangle$.

However, the function $\phi$ is not equal to the maximum-eigenvalue function in general: since $x = 0$ is feasible, the maximum is never negative, and $\phi(A) = \max(0, \lambda_{\max}(A))$.
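As a numerical sanity check of $\phi(A) = \max(0, \lambda_{\max}(A))$, the sketch below (my own illustration, not part of the question) estimates $\max_{\|x\| \le 1} x^T A x$ by evaluating the quadratic at random points of the unit ball together with the points where the maximum is actually attained: the unit eigenvectors of $A$ and the origin.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi_by_search(A, n_samples=20000):
    """Estimate max_{||x|| <= 1} x^T A x by evaluating the quadratic
    at random points of the closed unit ball, at the unit eigenvectors
    of A, and at the origin."""
    n = A.shape[0]
    X = rng.standard_normal((n_samples, n))
    nrm = np.linalg.norm(X, axis=1, keepdims=True)
    X = X / np.maximum(nrm, 1.0)            # project outside points onto the sphere
    _, V = np.linalg.eigh(A)                # columns of V are unit eigenvectors
    cands = np.vstack([X, V.T, np.zeros((1, n))])
    return np.einsum('ij,jk,ik->i', cands, A, cands).max()

for A in [np.diag([2.0, 1.0, -1.0]),        # lambda_max = 2  -> phi = 2
          np.diag([-3.0, -2.0, -1.0])]:     # lambda_max = -1 -> phi = 0, not -1
    lam_max = np.linalg.eigvalsh(A).max()
    print(phi_by_search(A), max(0.0, lam_max))
```

The second (negative definite) example is the one where $\phi$ and $\lambda_{\max}$ disagree: the search returns $0$, attained at the origin, while $\lambda_{\max}(A) = -1$.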

If you instead restrict $x$ to the unit sphere, then $\lambda_{\max}(A) = \max_{\|x\| = 1} \langle x, Ax \rangle$ (which is a convex function of $A$, for the same reason as above).
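The convexity of $A \mapsto \lambda_{\max}(A)$ can also be spot-checked numerically. The sketch below (an illustration, not a proof) verifies the convexity inequality $\lambda_{\max}(tA + (1-t)B) \le t\,\lambda_{\max}(A) + (1-t)\,\lambda_{\max}(B)$ on random symmetric matrices.

```python
import numpy as np

rng = np.random.default_rng(1)

def lam_max(A):
    # lambda_max(A) = max_{||x|| = 1} x^T A x for symmetric A
    return np.linalg.eigvalsh(A).max()

def rand_sym(n):
    M = rng.standard_normal((n, n))
    return (M + M.T) / 2.0

# Convexity: lam_max(t*A + (1-t)*B) <= t*lam_max(A) + (1-t)*lam_max(B)
violations = 0
for _ in range(1000):
    A, B, t = rand_sym(5), rand_sym(5), rng.uniform()
    lhs = lam_max(t * A + (1 - t) * B)
    rhs = t * lam_max(A) + (1 - t) * lam_max(B)
    if lhs > rhs + 1e-9:
        violations += 1
print("violations:", violations)   # convexity predicts 0
```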

However, the optimisation problem $\max_{\|x\|\le 1} \langle x, Ax \rangle = -\min_{\|x\|\le 1} \langle x, (-A)x \rangle$ is not necessarily a convex program: for a fixed $A$ it asks you to maximise a function of $x$ that may be convex rather than concave. If you restrict $A$ to the cone of negative semidefinite matrices, then the objective is concave and the optimisation problem is convex, but its optimal value is trivially $0$.

For example, if $A = 1$ (a $1 \times 1$ matrix), then the problem is $\max \{ x^2 : |x| \le 1 \}$, which is not a convex program.
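This scalar example makes the non-convexity visible: the sketch below (my own illustration) maximises $x^2$ over $[-1, 1]$ on a grid and finds the maximum attained at the two endpoints $\pm 1$, so the set of maximisers $\{-1, 1\}$ is not convex, whereas a convex program has a convex solution set.

```python
import numpy as np

# A = 1: maximise x^2 over |x| <= 1
x = np.linspace(-1.0, 1.0, 2001)
f = x**2
best = x[np.isclose(f, f.max())]
print(best)   # the maximum is attained only at the two endpoints -1 and 1
# The midpoint of the two maximisers, x = 0, is actually the
# *minimiser* of the objective: we are maximising a convex
# function over a convex set, which is not a convex program.
```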