Nuclear norm regularization of symmetric matrix


I have an optimization problem of the form

$$\min_{X \in \mathbb{R}^{n \times n}} g(X) - \lambda \|X\|_{*}$$

where the function $g$ is convex and differentiable. I would like to use proximal gradient descent to solve this. How do I handle the constraint that $X$ be symmetric? Or, formulated differently, how do I make sure that $X$ remains symmetric throughout my updates?

Best Answer

The right way to do this is not to treat the symmetry of $X$ as a constraint. Rather, it is to make the space of symmetric matrices the domain of the function.

So for example, treat this not as a function of $X\in\mathbb{R}^{n\times n}$, but rather as a function of $X\in\mathcal{S}^n$, the set of $n\times n$ symmetric matrices.
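As a side note (not part of the original answer), working on $\mathcal{S}^n$ also simplifies the regularizer itself: for a symmetric matrix the singular values are the absolute values of the eigenvalues, so $\|X\|_* = \sum_i |\lambda_i(X)|$. A quick numpy check of this fact, with an arbitrary symmetric test matrix:

```python
import numpy as np

# For a symmetric matrix the singular values are the absolute eigenvalues,
# so the nuclear norm reduces to the sum of |eigenvalues|.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
X = 0.5 * (A + A.T)                      # symmetric test matrix

nuc_svd = np.linalg.norm(X, ord="nuc")   # sum of singular values
nuc_eig = np.abs(np.linalg.eigvalsh(X)).sum()

print(nuc_svd, nuc_eig)                  # the two values agree
```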

The most important consequence of this way of thinking is that the gradient $\nabla g(X)$ will itself be a symmetric matrix, so you have to be careful to compute the gradient in a way that makes this hold. For example, if $g(X)=\tfrac{1}{2}\|X-P\|_F^2$, where $P$ is symmetric, then $\nabla g(X) = X - P$, which is symmetric whenever $X$ is.
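To close the loop on the question of the iterates staying symmetric: the prox step can also be kept on $\mathcal{S}^n$, because for a symmetric argument the nuclear norm prox amounts to soft-thresholding the eigenvalues. Below is a minimal numpy sketch of one proximal gradient update for the example $g(X)=\tfrac{1}{2}\|X-P\|_F^2$, assuming the standard regularized form with $+\lambda\|X\|_*$ (the question as posted has a minus sign, which would not be a convex problem); the names `prox_nuclear`, `prox_grad_step`, the step size `t`, and `lam` are illustrative, not from the original post.

```python
import numpy as np

def grad_g(X, P):
    # Gradient of g(X) = 0.5 * ||X - P||_F^2 on S^n;
    # symmetric whenever X and P are symmetric.
    return X - P

def prox_nuclear(X, tau):
    # Prox of tau * ||.||_* at a symmetric X: soft-threshold the eigenvalues.
    # Using an eigendecomposition (instead of a general SVD) keeps the result symmetric.
    w, Q = np.linalg.eigh(X)
    w_shrunk = np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)
    return (Q * w_shrunk) @ Q.T

def prox_grad_step(X, P, lam, t):
    # One update X <- prox_{t * lam * ||.||_*}(X - t * grad g(X)).
    return prox_nuclear(X - t * grad_g(X, P), t * lam)

# Illustrative usage: both the gradient step and the eigenvalue-based prox
# preserve symmetry, so the iterates never leave S^n.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
P = 0.5 * (A + A.T)
X = np.zeros((6, 6))
for _ in range(50):
    X = prox_grad_step(X, P, lam=0.1, t=1.0)
print(np.allclose(X, X.T))  # True
```

Because every operation in the update maps symmetric matrices to symmetric matrices, symmetry never has to be enforced as an explicit constraint, which is exactly the point of the answer.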