Trace-preserving low-rank symmetric matrix approximation


I have a symmetric $n\times n$ matrix $H$ and I want to compute a rank-$k$ approximation that preserves the trace. In symbols:

$$ H \approx J D J^\top, \quad \operatorname{Tr}(H) = \operatorname{Tr}(J D J^\top),$$

where $D$ is $k\times k$ diagonal and $J$ is $n\times k$.

My current idea is the following. I take the largest-magnitude eigenvalue with the same sign as the trace and put it in $D_{11}$, with $J_1$ (the first column of $J$) set to the corresponding normalized eigenvector. Then I deflate this eigenspace from the matrix, $H' = H - D_{11}J_1J_1^\top$, and repeat recursively on $H'$ until I have picked $k$ eigenpairs. Finally, I rescale $D$ so that $\operatorname{Tr}(D) = \operatorname{Tr}(H)$.
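To make the procedure concrete, here is a minimal NumPy sketch of the greedy scheme above. It is illustrative only: it recomputes a full eigendecomposition at every step rather than using the $O(n^2 k)$ iterative approach, and the function name and the handling of a zero trace are my own assumptions.

```python
import numpy as np

def trace_preserving_lowrank(H, k):
    """Greedy rank-k symmetric approximation H ~ J @ diag(d) @ J.T with
    Tr(J diag(d) J.T) = Tr(H). Illustrative sketch, not the O(n^2 k) version."""
    H = np.asarray(H, dtype=float)
    n = H.shape[0]
    t = np.trace(H)
    s = np.sign(t) if t != 0 else 1.0  # assumption: fall back to + for zero trace
    Hp = H.copy()
    J = np.zeros((n, k))
    d = np.zeros(k)
    for i in range(k):
        w, V = np.linalg.eigh(Hp)  # full decomposition, for clarity only
        # candidates: eigenvalues whose sign matches the sign of Tr(H)
        cand = np.where(np.sign(w) == s)[0]
        j = cand[np.argmax(np.abs(w[cand]))]  # largest magnitude among them
        d[i] = w[j]
        J[:, i] = V[:, j]
        # deflate the chosen eigenspace before the next pick
        Hp = Hp - d[i] * np.outer(J[:, i], J[:, i])
    # rescale so Tr(D) = Tr(H); since all d[i] share the trace's sign,
    # the scale factor is positive and no eigenvalue sign is flipped
    d *= t / d.sum()
    return J, d
```

Because all picked eigenvalues share the trace's sign, their sum has that sign too, so the final rescaling factor is positive and the sign-preservation property holds by construction.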

This is similar to the usual (non-trace-preserving) truncated eigendecomposition, but it can differ. Example: for $k=1$, if the largest-magnitude eigenvalue is positive but the trace is negative, the most negative eigenvalue is picked instead.
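A tiny numeric instance of that $k=1$ divergence (the matrix here is just an assumed example of mine, not from the original problem):

```python
import numpy as np

H = np.diag([3.0, -2.0, -2.0])  # Tr(H) = -1, largest-magnitude eigenvalue is +3
w, V = np.linalg.eigh(H)

# Usual rank-1 truncation keeps the largest-magnitude eigenvalue, +3,
# even though its sign disagrees with the trace.
i_usual = np.argmax(np.abs(w))

# Sign-constrained pick: largest magnitude among eigenvalues sharing sign(Tr(H)).
cand = np.where(np.sign(w) == np.sign(np.trace(H)))[0]
i_here = cand[np.argmax(np.abs(w[cand]))]

print(w[i_usual], w[i_here])  # 3.0 -2.0
```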

In my problem the sign of the eigenvalues matters, so I proceed this way to ensure that rescaling to preserve the trace never flips an eigenvalue's sign.

Also, this can be implemented in $O(n^2 k)$ time, like the usual truncation.

Does anyone know anything about this kind of approximation? Has it already been studied? I could not find it.

(My problem is computing second-order propagation of moments. $H$ is the Hessian, and for a moment of order $m$ the computation costs $O(n^m)$ at full rank and $O(n k^m)$ at rank $k$. The trace is the first-order moment.)