Solving the trace optimization problem $L\Sigma L^T$ for symmetric matrices


Let $d',d \in \mathbb{N}$, with $d' \le d$ and let $\Sigma$ be a symmetric matrix of dimension $d$. I'm trying to solve the following problem:

$$\max_{\substack{L \in \mathcal{M}_{d'\times d}(\mathbb{R}) \\ LL^T = I}} \quad \operatorname{trace}\left(L \Sigma L^T\right)$$

I can prove that, when $\Sigma$ is positive semidefinite, a solution is obtained by taking as the rows of $L$ orthonormal eigenvectors of $\Sigma$ associated with its $d'$ largest eigenvalues. I have also read papers that use this fact assuming only that $\Sigma$ is symmetric.
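For concreteness, here is a small numerical sketch of that claim (the dimensions $d = 6$, $d' = 3$ and the random construction of $\Sigma$ are illustrative assumptions, not part of the question): with the rows of $L$ set to the top-$d'$ eigenvectors, $\operatorname{trace}(L\Sigma L^T)$ equals the sum of the $d'$ largest eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
d, d_prime = 6, 3  # illustrative sizes

# Random symmetric positive semidefinite Sigma
A = rng.standard_normal((d, d))
Sigma = A @ A.T

# eigh returns eigenvalues in ascending order; take the d' largest
eigvals, eigvecs = np.linalg.eigh(Sigma)
L = eigvecs[:, -d_prime:].T  # rows = top eigenvectors, so L @ L.T = I

value = np.trace(L @ Sigma @ L.T)
print(np.isclose(value, eigvals[-d_prime:].sum()))  # True
```

Since the eigenvector columns from `eigh` are orthonormal, the constraint $LL^T = I$ holds by construction.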

However, my proof for the positive semidefinite case relies on an inequality that depends critically on the non-negativity of the eigenvalues of $\Sigma$; if $\Sigma$ had negative eigenvalues, that inequality would fail. That is why I cannot generalize the proof to an arbitrary symmetric matrix.

How can I make a proof for this general case? Thanks in advance.

Best answer:

I suspect that your inequality would work just fine with negative eigenvalues.

That being said, we can extend your result as follows. If $\Sigma$ is symmetric, then there exists an $\alpha > 0$ such that $\Sigma + \alpha I$ is positive semidefinite (e.g. $\alpha = 1 + |\lambda_{\min}(\Sigma)|$). We note that for any $L$ with $LL^T = I$, $$ \operatorname{trace}(L(\Sigma + \alpha I)L^T) = \operatorname{trace}(L\Sigma L^T) + \alpha \operatorname{trace}(LL^T) = \operatorname{trace}(L\Sigma L^T) + \alpha d'. $$ The two objectives differ by the constant $\alpha d'$, so $L$ maximizes $\operatorname{trace}(L(\Sigma + \alpha I)L^T)$ if and only if it maximizes $\operatorname{trace}(L\Sigma L^T)$. Since $\Sigma + \alpha I$ is positive semidefinite and has the same eigenvectors as $\Sigma$ (with every eigenvalue shifted by $\alpha$), your result applies to it and yields the same optimal $L$: the rows are eigenvectors associated with the $d'$ largest eigenvalues of $\Sigma$.
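The shift identity above is easy to check numerically. A minimal sketch, assuming an indefinite symmetric $\Sigma$ of illustrative size $d = 5$, $d' = 2$, and an arbitrary $L$ with orthonormal rows built via a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(1)
d, d_prime = 5, 2  # illustrative sizes

# Symmetric but (generically) indefinite Sigma
A = rng.standard_normal((d, d))
Sigma = (A + A.T) / 2

# Shift chosen so that Sigma + alpha*I is positive semidefinite
alpha = max(0.0, -np.linalg.eigvalsh(Sigma).min())

# An arbitrary L with orthonormal rows: QR gives orthonormal columns, transpose them
Q, _ = np.linalg.qr(rng.standard_normal((d, d_prime)))
L = Q.T

lhs = np.trace(L @ (Sigma + alpha * np.eye(d)) @ L.T)
rhs = np.trace(L @ Sigma @ L.T) + alpha * d_prime
print(np.isclose(lhs, rhs))  # True
```

Because the identity holds for every feasible $L$, the maximizers of the shifted and unshifted objectives coincide.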