Matrix weighted least-squares with nuclear norm regularization


Nuclear norm minimization is very popular, and the formulation of least squares with nuclear norm regularization is as follows:

$$\min\limits_{X \in \Bbb R^{3 \times 3}} \frac12 \| X -Y \|_F^2 + \lambda\| X \|_{*}.$$

The minimization problem above can be solved in closed form via singular value thresholding (SVT), and the least-squares term corresponds to the assumption that the noise is i.i.d. Gaussian.
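Concretely, the SVT solution soft-thresholds the singular values of $Y$ by $\lambda$. A minimal NumPy sketch (the helper name `svt` is illustrative, not from any particular library):

```python
import numpy as np

def svt(Y, lam):
    """Singular value thresholding: closed-form minimizer of
    (1/2)||X - Y||_F^2 + lam * ||X||_*.
    Soft-thresholds each singular value of Y by lam."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_thr = np.maximum(s - lam, 0.0)   # shrink, clip at zero
    return U @ (s_thr[:, None] * Vt)   # rebuild with thresholded spectrum

# Example: threshold a random 3x3 matrix
rng = np.random.default_rng(0)
Y = rng.standard_normal((3, 3))
X = svt(Y, 0.5)
```

For large enough $\lambda$ (above the largest singular value of $Y$), the minimizer is exactly the zero matrix.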

However, if the noise is Gaussian with a different variance in each column, we arrive at the following weighted least-squares problem:

\begin{equation} \min\limits_{X \in \Bbb R^{3 \times 3}} \frac12 \left\| \big( X -Y \big) W \right\|_F^2 + \lambda\|X \|_{*} \end{equation}

where $W = \mbox{diag} (w_1, w_2, w_3)$. How can the optimization problem above be solved? Any suggestions or reference papers are appreciated.

2 Answers


If you already know how to solve the first type of problem, an obvious approach is to make the second problem look like the first.

Towards that end, let $$\eqalign{ X' & = XW \cr Y' &= YW \cr \lambda' &= \lambda \, \|W^{-1}\|_* \cr }$$

In terms of the primed variables, the second problem now looks like the first.

This transformation relies on the fact that the nuclear norm is sub-multiplicative: $\|X\|_* = \|X'W^{-1}\|_* \le \|X'\|_* \, \|W^{-1}\|_*$, so the primed objective upper-bounds the original one.
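As a sketch, this substitution amounts to solving the primed (unweighted) problem by SVT and mapping back via $X = X'W^{-1}$. Note that, because sub-multiplicativity only gives an inequality, this solves a surrogate that upper-bounds the original objective rather than the original problem exactly (function names below are illustrative):

```python
import numpy as np

def svt(Y, lam):
    # Soft-threshold the singular values of Y by lam
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ (np.maximum(s - lam, 0.0)[:, None] * Vt)

def weighted_via_substitution(Y, w, lam):
    """Change of variables: Y' = Y W, lam' = lam * ||W^{-1}||_*,
    solve the unweighted problem in X', then recover X = X' W^{-1}.
    Minimizes an upper bound on the weighted objective, since
    ||X||_* = ||X' W^{-1}||_* <= ||X'||_* ||W^{-1}||_*."""
    w = np.asarray(w, dtype=float)
    lam_prime = lam * np.sum(1.0 / np.abs(w))  # nuclear norm of diagonal W^{-1}
    X_prime = svt(Y * w, lam_prime)            # right-multiply by diag(w)
    return X_prime / w                         # right-multiply by diag(1/w)
```

With $W = I$ this reduces to plain SVT with the inflated parameter $\lambda' = \lambda\,\|I\|_* = 3\lambda$, which illustrates that the surrogate is conservative.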


This problem can be solved with standard proximal gradient algorithms. Hope it helps.
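A minimal proximal-gradient sketch (assuming NumPy): the smooth term $f(X)=\frac12\|(X-Y)W\|_F^2$ has gradient $\nabla f(X) = (X-Y)W^2$ with Lipschitz constant $L = \max_i w_i^2$, and the proximal operator of $\lambda\|\cdot\|_*$ is SVT, so each iteration is a gradient step followed by singular value thresholding:

```python
import numpy as np

def svt(Z, tau):
    # Prox of tau * ||.||_* : soft-threshold singular values by tau
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def prox_grad_weighted(Y, w, lam, n_iter=500):
    """Proximal gradient for
        min (1/2)||(X - Y) W||_F^2 + lam ||X||_*,  W = diag(w).
    Gradient of the smooth part: (X - Y) W^2; step size 1/L, L = max(w_i^2)."""
    w2 = np.asarray(w, dtype=float) ** 2
    L = w2.max()
    X = Y.copy()
    for _ in range(n_iter):
        grad = (X - Y) * w2            # right-multiply by diag(w^2), columnwise
        X = svt(X - grad / L, lam / L) # gradient step, then prox step
    return X
```

With $\lambda = 0$ the iterates converge to $Y$ itself, which is a quick sanity check; for $\lambda > 0$ the objective at the output is no larger than at the starting point $X = Y$.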