Proof of the semidefinite-programming representation of the minimum eigenvalue.


The optimization problem is $\lambda_{\min}(A) := \min\{\langle Y, A\rangle : Y \succeq 0,\ \operatorname{trace}(Y) = 1\}$, where $A$ and $Y$ are symmetric matrices. Finding the minimum eigenvalue of $A$ is equivalent to solving this optimization problem. I saw this at https://people.eecs.berkeley.edu/~wainwrig/ee227a/SDP_Duality.pdf, equation (13.3).

But I don't know why this is true. Please help me prove it. Thanks in advance :)

BEST ANSWER

I did a little research and will reason about the problem from an optimization viewpoint. First, recall the Rayleigh principle:

$\lambda_{\min}(A) = \min_{x\in \mathbb{R}^n,\, x\neq 0} \dfrac{x^TAx}{x^Tx} = \min_{\|y\|=1} y^TAy = \min_{\|y\|=1} \operatorname{trace}(Ayy^T)$.
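As a quick numerical sanity check of this chain of equalities (a sketch using NumPy; the random symmetric matrix `A` is just an illustration, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2           # a random symmetric test matrix

lam, U = np.linalg.eigh(A)  # eigenvalues in ascending order
lam_min = lam[0]

# the Rayleigh quotient attains lambda_min at the minimal eigenvector
v = U[:, 0]
assert np.isclose(v @ A @ v, lam_min)

# for any unit vector y: y^T A y = trace(A y y^T), and it is >= lambda_min
y = rng.standard_normal(n)
y /= np.linalg.norm(y)
assert np.isclose(y @ A @ y, np.trace(A @ np.outer(y, y)))
assert y @ A @ y >= lam_min - 1e-12
```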

Letting $Y = yy^T$, the objective function becomes

$\min \langle A, Y\rangle$,

and the constraint set is $\mathcal{O} := \{yy^T : y^Ty = 1\}$. By the lemma in https://ia801409.us.archive.org/25/items/onsumoflargestei00over/onsumoflargestei00over.pdf, the convex hull of this set is $\operatorname{conv}(\mathcal{O}) = \{Y : Y = Y^T,\ \operatorname{trace}(Y) = 1,\ I \succeq Y \succeq 0\} = \{Y : \operatorname{trace}(Y) = 1,\ Y \succeq 0\}$, where the last equality holds because a positive semidefinite matrix with unit trace automatically satisfies $Y \preceq I$ (its eigenvalues are nonnegative and sum to $1$). Since the objective is linear, minimizing over $\mathcal{O}$ gives the same value as minimizing over $\operatorname{conv}(\mathcal{O})$.

Combining the new objective and the relaxed constraint set, we get the formulation asked about above:
$\lambda_{\min}(A) = \min\{\langle A, Y\rangle : \operatorname{trace}(Y) = 1,\ Y \succeq 0\}$.
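The resulting formulation can be checked numerically (a sketch, again with a random symmetric `A`): the rank-one matrix $Y^* = u_1 u_1^T$ built from the minimal eigenvector is feasible and attains $\lambda_{\min}$, while randomly sampled feasible $Y$ never go below it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
lam, U = np.linalg.eigh(A)
lam_min = lam[0]

# Y* = u1 u1^T is feasible (PSD, unit trace) and attains lambda_min
Y_star = np.outer(U[:, 0], U[:, 0])
assert np.isclose(np.trace(Y_star), 1.0)
assert np.isclose(np.sum(A * Y_star), lam_min)  # <A, Y> = trace(A Y) = sum(A * Y)

# every randomly sampled feasible Y gives an objective value >= lambda_min
for _ in range(100):
    W = rng.standard_normal((n, n))
    Y = W @ W.T        # positive semidefinite
    Y /= np.trace(Y)   # normalize to unit trace
    assert np.sum(A * Y) >= lam_min - 1e-10
```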

ANOTHER ANSWER

You may diagonalize $Y=\sum_i \mu_i e_i e_i^T$ (orthonormal basis), where $\mu_i\geq 0$ (positive semidefiniteness) and $\sum_i \mu_i=1$ (the trace condition). Likewise write $A=\sum_j \lambda_j u_j u^T_j$ (orthonormal basis, with $\lambda_1\leq \lambda_2\leq \cdots$). The scalar product then becomes: $$ \langle Y, A\rangle = \sum_i \mu_i e_i^T A e_i = \sum_j \lambda_j \left( \sum_i \mu_i |e_i^T u_j|^2 \right) \geq \lambda_1 \sum_i \left( \mu_i \sum_j |e_i^T u_j|^2 \right) =\lambda_1 \sum_i \mu_i =\lambda_1, $$ using that the $u_j$ form an orthonormal basis for the next-to-last equality. The middle inequality becomes an equality when $\mu_1=1$ and $e_1=u_1$, so the minimum value is exactly $\lambda_1 = \lambda_{\min}(A)$.
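The key identity in this answer, that $\langle Y, A\rangle$ is a convex combination of the $\lambda_j$ with weights $\sum_i \mu_i |e_i^T u_j|^2$, can be verified numerically (a sketch; the random spectral data for $Y$ and $A$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
lam, U = np.linalg.eigh(A)  # lam ascending; columns of U are the u_j

# random Y with trace 1 and Y >= 0, built from its spectral data mu_i, e_i
mu = rng.random(n)
mu /= mu.sum()                                    # mu_i >= 0, sum to 1
E, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns e_i
Y = (E * mu) @ E.T                                # Y = sum_i mu_i e_i e_i^T

# weights w_j = sum_i mu_i |e_i^T u_j|^2 form a convex combination ...
w = np.array([sum(mu[i] * (E[:, i] @ U[:, j]) ** 2 for i in range(n))
              for j in range(n)])
assert np.isclose(w.sum(), 1.0)

# ... and <Y, A> = sum_j lam_j w_j >= lam_1
assert np.isclose(np.sum(A * Y), lam @ w)
assert np.sum(A * Y) >= lam[0] - 1e-12
```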