Computing the minimal eigenvalue


One may compute the minimum eigenvalue of $ M \in \mathbb{S}_{++}^n$ by solving the problem \begin{align} &\text{minimize}_{ x} \quad x^TMx,\\ &\text{subject to} \quad \|x\|_2^2 \geq 1. \end{align}

The optimal value of the problem is the minimum eigenvalue of $M$, attained when $x$ is a unit eigenvector of $M$ associated with its minimum eigenvalue.
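A quick numerical check of this claim (the matrix `M` below is a random positive definite example of my own, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = A @ A.T + np.eye(5)           # random symmetric positive definite matrix

lam, V = np.linalg.eigh(M)        # eigenvalues in ascending order
v = V[:, 0]                       # unit eigenvector of lambda_min; ||v||_2 = 1 is feasible
assert np.isclose(v @ M @ v, lam[0])

# No feasible point (||x||_2 >= 1) does better, since
# x^T M x >= lambda_min ||x||^2 >= lambda_min whenever lambda_min > 0:
for _ in range(1000):
    x = rng.standard_normal(5)
    x *= (1.0 + rng.random()) / np.linalg.norm(x)   # ||x|| in [1, 2]
    assert x @ M @ x >= lam[0] - 1e-9
```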

Q: Determine whether the problem is convex. If not, find a convex relaxation so as to obtain a convex optimization problem.

Hint: One may linearize the function in the constraint at each iteration.

I know this problem is not convex, since the constraint is non-convex. But how do I relax the constraint?


By "relaxing," I assume you accept an approximate solution; the hint about linearization implies as much. First, extending @Casey's suggestion, the constrained problem can be reformulated as
$$ \text{minimize}_x \quad x^TMx + \phi_m(f_1(x)), $$
where $\phi_m$ is the logarithmic barrier (penalty) function
$$ \phi_m(f_i(x)) = -\frac{1}{m}\sum_{i=1}^{p}\log(-f_i(x)), $$
and the $f_i$ are the constraint functions of the original problem, written so that
$$ f_i(x) \leq 0. $$
In this case there is a single constraint,
$$ f_1(x) = 1 - x^Tx. $$
We can linearize it at a point $x_0$:
$$ f_1(x) \approx f_1(x_0) + \nabla f_1(x_0)^T(x - x_0) = 1 - x_0^Tx_0 - 2x_0^T(x - x_0). $$
Therefore, for each $m > 0$, the original problem becomes the unconstrained problem
$$ \text{minimize}_x \quad m\,x^TMx - \log\left(2x_0^T(x - x_0) + x_0^Tx_0 - 1\right). $$
This is an unconstrained convex problem, which the cvx package in MATLAB (or any suitable solver) can handle. You solve it iteratively: at each step, find the optimal point $x^*(m)$, check whether it is close to the optimum of the previous iteration (say, $x^*(10m)$ close to $x^*(m)$ if $m$ grows by a factor of $10$ per step), and relinearize at the new point. As $m \to +\infty$, the optimal point $x^*(m)$ converges to the optimum $x^*$ of the original problem.