For a matrix $A$, the Singular Value Decomposition yields the closest low-rank approximation $$A_K=\sum_{i=1}^K\sigma_i \vec{v}_i \vec{u}_i^T$$ in the sense that $\|A-A_K\|_F$ is minimal over all matrices of rank at most $K$.
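For reference, the plain truncated-SVD approximation (the Eckart–Young theorem) can be computed like this; `svd_truncate` is just an illustrative helper name:

```python
import numpy as np

def svd_truncate(A, K):
    """Best rank-K approximation of A in Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Keep only the K leading singular triplets; broadcasting scales
    # each column of U[:, :K] by the corresponding singular value.
    return (U[:, :K] * s[:K]) @ Vt[:K, :]
```

The residual then satisfies $\|A-A_K\|_F^2 = \sum_{i>K}\sigma_i^2$.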
I'd like to do the same but allow for an additional diagonal term; that is, for a given square matrix $A$ and a positive integer $K$, find a diagonal matrix $D$ and a low-rank approximation $$A_K=D+\sum_{i=1}^K\sigma_i \vec{v}_i \vec{u}_i^T$$ such that, as above, $\|A-A_K\|_F$ is minimal.
The problem originated in the context of correlation matrices, so answers that additionally assume $A$ is symmetric and positive semi-definite are also welcome.
Things are going to be tricky for this one. Rank and the Frobenius norm are unitarily invariant, but the property of being "diagonal" is not.
The best approach I can think of, off the top of my head, is as follows: we can define a matrix norm by $$ \|M\|_{F_K}^2 = \sum_{i=1}^K [\sigma_i(M)]^2 $$ Your question can then be re-framed as follows: for a matrix $A$, find the diagonal matrix $D$ such that $$ f(D) = \sum_{i=K+1}^n [\sigma_i(A - D)]^2 = \|A - D\|_F^2 - \|A - D\|_{F_K}^2 $$ is minimized. If there's any hope of getting a nice formula for this, it will be from applying some kind of calculus/Lagrange multiplier argument to this. Alternatively, this could presumably be made into some kind of quadratic or semidefinite program.
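Failing a closed form, one practical heuristic (not from the answer above, and not guaranteed to find the global minimum) is alternating minimization: for fixed $D$, the best rank-$K$ term is the truncated SVD of $A-D$; for a fixed low-rank term $L$, the best diagonal is simply $\operatorname{diag}(A-L)$. A minimal sketch, with the function name `diag_plus_lowrank` being hypothetical:

```python
import numpy as np

def diag_plus_lowrank(A, K, iters=100):
    """Alternating heuristic for A ~ D + L with D diagonal, rank(L) <= K.

    Each half-step solves its subproblem exactly, so the Frobenius
    objective ||A - D - L||_F is non-increasing; convergence to the
    global optimum is NOT guaranteed.
    """
    n = A.shape[0]
    d = np.zeros(n)                      # current diagonal of D
    for _ in range(iters):
        # Best rank-K term for the current diagonal (Eckart-Young).
        U, s, Vt = np.linalg.svd(A - np.diag(d), full_matrices=False)
        L = (U[:, :K] * s[:K]) @ Vt[:K, :]
        # Best diagonal for the current low-rank term.
        d = np.diag(A - L)
    return np.diag(d), L
```

Since the scheme starts from $D=0$, its first iterate is already the plain rank-$K$ truncation, so the final error can only be at least as good as the SVD-only approximation.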