SVD with Sparsity Penalties


I am trying to solve a problem like

$$\min_{u,v} \Vert X-uv^T \Vert_F^2 + \beta \Vert v \Vert_1 \mbox{ s.t. } \Vert u \Vert_2 \le 1$$

where $X \in \mathbb{R}^{N\times N}$ and $u, v \in \mathbb{R}^{N}$, which is basically a rank-1 (truncated) SVD with a sparsity penalty. I understand that this objective is not jointly convex in $u$ and $v$, but if the sparsity penalty is removed (and a similar norm constraint is added for $v$) it could be solved exactly using the SVD.
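As a sanity check on that last claim: with $\beta = 0$ and $v$ left free, the Eckart–Young theorem says the minimizer is the top singular pair, with the singular value absorbed into $v$. A quick NumPy verification (random $X$, not from the question):

```python
import numpy as np

# With beta = 0 and v unconstrained, the best rank-1 approximation
# under ||u||_2 <= 1 is the top singular pair (Eckart-Young).
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 6))

U, s, Vt = np.linalg.svd(X)
u = U[:, 0]          # ||u||_2 = 1, satisfies the constraint
v = s[0] * Vt[0]     # v absorbs the leading singular value

err = np.linalg.norm(X - np.outer(u, v), "fro") ** 2
# the residual equals the sum of the squared trailing singular values
assert np.isclose(err, np.sum(s[1:] ** 2))
```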

So my question is: Are there variants to SVD that can incorporate penalties/constraints like the $\ell_1$ norm? Would something like variable splitting (as in ADMM) work in this case and find the global minimum?

2 Answers

BEST ANSWER

The problem you mention is closely related to sparse principal component analysis (sparse PCA). There is a vast literature on this problem, including greedy nonconvex algorithms and semidefinite relaxations; searching for "sparse PCA" will turn up many highly cited papers.

For your particular formulation, you might want to look at the recent literature on nonconvex optimization, in particular alternating minimization.
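A minimal sketch of such an alternating scheme for the question's exact objective (my own construction, not an algorithm from any specific paper; the names `sparse_rank1` and `soft_threshold` are illustrative). Restricting $u$ to the unit sphere loses nothing: any slack in $\Vert u \Vert_2$ could be shifted into the scale of $v$, which only lowers the $\ell_1$ term. With $\Vert u \Vert_2 = 1$ fixed, the $v$-subproblem separates coordinate-wise and is solved by soft-thresholding $X^T u$ at $\beta/2$; with $v$ fixed, the $u$-subproblem over the sphere is solved by $u = Xv/\Vert Xv \Vert_2$:

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: the prox operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_rank1(X, beta, n_iter=200):
    """Alternating minimization for
       min ||X - u v^T||_F^2 + beta ||v||_1  s.t.  ||u||_2 <= 1.
    Keeping u on the unit sphere is without loss of generality, since
    scale can always be moved from v into u to shrink the l1 term."""
    # Deterministic init: the largest-norm column of X, normalized.
    j = np.argmax(np.linalg.norm(X, axis=0))
    u = X[:, j]
    nrm = np.linalg.norm(u)
    u = u / nrm if nrm > 0 else np.ones(X.shape[0]) / np.sqrt(X.shape[0])
    v = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # v-step: with ||u||_2 = 1 the objective separates per coordinate;
        # the exact minimizer is soft-thresholding of X^T u at beta/2.
        v = soft_threshold(X.T @ u, beta / 2.0)
        if not np.any(v):
            break  # beta too large: v = 0 is a stationary point
        # u-step: over the unit sphere, minimizing the residual is
        # maximizing u^T X v, solved by u = Xv / ||Xv||_2.
        Xv = X @ v
        u = Xv / np.linalg.norm(Xv)
    return u, v
```

Each step solves its subproblem exactly, so the objective decreases monotonically; like the nonconvex methods above, this only guarantees convergence to a stationary point, not the global minimum.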

ANOTHER ANSWER

This is a special case of Jenatton et al.'s "structured sparse PCA" model with $r = 1$ component / latent factor, with the sparsity-inducing penalty $\Omega$ specialized to the plain $\ell_1$ norm. The model can then be optimized by block coordinate descent, e.g. Algorithm 1 of that paper.