Maximizing a function of the maximal and minimal eigenvalues


In my work, I encountered the following optimization problem.

Given symmetric, positive definite matrices $K_1, K_2, \dots, K_n$, find symmetric, positive definite matrix $R$ that minimizes

$$\max \{ |1-\alpha_{\max}|, |1-\alpha_{\min}| \}$$

where $\alpha_{\min}$ and $\alpha_{\max}$ are the smallest and largest eigenvalues over all of the products $K_1 R, K_2 R, \dots, K_n R$.
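For concreteness, here is a minimal sketch of how the objective can be evaluated numerically; the matrices `Ks` and `R` are randomly generated placeholders, not data from my application:

```python
import numpy as np

def objective(R, Ks):
    """max{|1 - a_max|, |1 - a_min|} over the eigenvalues of all K_i R."""
    # Each K_i R is similar to the SPD matrix K_i^{1/2} R K_i^{1/2},
    # so its eigenvalues are real and positive.
    eigs = np.concatenate([np.linalg.eigvals(K @ R).real for K in Ks])
    return max(abs(1 - eigs.max()), abs(1 - eigs.min()))

rng = np.random.default_rng(0)

def random_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)  # symmetric positive definite

Ks = [random_spd(4) for _ in range(3)]
R = random_spd(4)
print(objective(R, Ks))
```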

Are there any explicit solutions for this problem? If not, are there any numerical algorithms?

Thanks in advance.


Edit 1: I am now looking for a numerical solution based on gradient descent. The eigenvalues of $K_1 R, K_2 R, \dots, K_n R$ are the same as those of the symmetric matrices $S_1=\sqrt{R}\,K_1 \sqrt{R},\ S_2=\sqrt{R}\,K_2 \sqrt{R},\ \dots,\ S_n=\sqrt{R}\,K_n \sqrt{R}$. The derivative of an eigenvalue $\alpha$ with respect to the symmetric matrix $S_i$ is given in the link Derivatives of eigenvalues.
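The similarity behind this claim is $K R = R^{-1/2}\big(\sqrt{R}\,K\sqrt{R}\big)R^{1/2}$. A quick numerical check (random SPD placeholders, `scipy.linalg.sqrtm` for the principal square root):

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)); K = A @ A.T + 4 * np.eye(4)  # SPD
B = rng.standard_normal((4, 4)); R = B @ B.T + 4 * np.eye(4)  # SPD

sqrtR = sqrtm(R).real          # principal square root of an SPD matrix
S = sqrtR @ K @ sqrtR          # symmetric, similar to K R

e1 = np.sort(np.linalg.eigvals(K @ R).real)
e2 = np.sort(np.linalg.eigvalsh(S))
print(np.allclose(e1, e2))     # True
```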

In short, $\partial \alpha/\partial S_i=v\otimes v$, where $v$ is the associated (unit) eigenvector. The remaining work is to find $\partial S_i/\partial R$. Are there explicit expressions for $\partial S_i/\partial R$ or $\partial \sqrt{R}/\partial R$?
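A finite-difference sanity check of $\partial \alpha/\partial S = v \otimes v$ for a simple (non-degenerate) eigenvalue; the symmetric matrix $S$ and the perturbation direction $H$ are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
S = (A + A.T) / 2                     # symmetric test matrix

w, V = np.linalg.eigh(S)
v = V[:, -1]                          # unit eigenvector of the largest eigenvalue
grad = np.outer(v, v)                 # claimed gradient: v (x) v

# Central finite difference along a random symmetric direction H
H = rng.standard_normal((5, 5)); H = (H + H.T) / 2
eps = 1e-6
fd = (np.linalg.eigvalsh(S + eps * H)[-1]
      - np.linalg.eigvalsh(S - eps * H)[-1]) / (2 * eps)
print(np.isclose(fd, np.sum(grad * H)))  # True: <v v^T, H> = v^T H v
```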

Answer:

Let us consider the simplest case, i.e., the case where $n=1$. Rephrasing slightly:

Given symmetric and positive definite $\rm A$, find symmetric and positive definite $\rm X$ that minimizes

$$\max \big\{ | 1 - \lambda_{\max} (\mathrm A \mathrm X) |, | 1 - \lambda_{\min} (\mathrm A \mathrm X) | \big\}$$

Note that

$$\begin{aligned} \max \big\{ | 1 - \lambda_{\max} (\mathrm A \mathrm X) |, | 1 - \lambda_{\min} (\mathrm A \mathrm X) | \big\} &= \max \big\{ | \lambda_{\min} (\mathrm I - \mathrm A \mathrm X) |, | \lambda_{\max} (\mathrm I - \mathrm A \mathrm X) | \big\}\\ &= \max \big\{ \big| \lambda_{\min} \big( \mathrm I - \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} \big) \big|, \big| \lambda_{\max} \big( \mathrm I - \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} \big) \big| \big\}\\ &= \rho \left( \mathrm I - \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} \right) = \left\| \mathrm I - \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} \right\|_2 = \left\| \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} - \mathrm I\right\|_2\end{aligned}$$
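The last two equalities hold because $\mathrm I - \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12}$ is symmetric, so its spectral radius equals its spectral norm. A numerical sketch of the identity chain, with random SPD placeholders $\mathrm A$ and $\mathrm X$:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(3)
def random_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, X = random_spd(4), random_spd(4)
lam = np.linalg.eigvals(A @ X).real           # eigenvalues of A X (real, positive)
lhs = max(abs(1 - lam.max()), abs(1 - lam.min()))

sA = sqrtm(A).real
rhs = np.linalg.norm(sA @ X @ sA - np.eye(4), 2)  # spectral norm
print(np.isclose(lhs, rhs))                   # True
```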

Replacing the constraint $\mathrm X \succ \mathrm O$ (positive definite) with $\mathrm X \succeq \mathrm O$ (positive semidefinite), we obtain the following convex optimization problem

$$\begin{array}{ll} \text{minimize} & \left\| \mathrm A^{\frac 12} \mathrm X \mathrm A^{\frac 12} - \mathrm I\right\|_2\\ \text{subject to} & \mathrm X \succeq \mathrm O\end{array}$$
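This is a semidefinite program, so off-the-shelf convex solvers apply. A minimal CVXPY sketch (random SPD placeholder $\mathrm A$); as a sanity check, $\mathrm X = \mathrm A^{-1}$ makes the objective zero, so the solver should drive the objective to essentially 0:

```python
import numpy as np
import cvxpy as cp
from scipy.linalg import sqrtm

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)          # SPD data
sA = sqrtm(A).real

X = cp.Variable((4, 4), PSD=True)    # constraint X >= 0 (positive semidefinite)
prob = cp.Problem(cp.Minimize(cp.norm(sA @ X @ sA - np.eye(4), 2)))
prob.solve()

print(prob.value)                                   # ~0 for n = 1
print(np.linalg.norm(X.value - np.linalg.inv(A)))   # small, up to solver tolerance
```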