Fast computation/estimation of the nuclear norm of a matrix


The nuclear norm of a matrix is defined as the sum of its singular values, which can be read off from the singular value decomposition (SVD) of the matrix. It is of central importance in signal processing and statistics, where it is used for matrix completion and dimensionality reduction.
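For concreteness, here is the baseline computation via a full SVD (NumPy also exposes the nuclear norm directly as a matrix norm):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Baseline: sum the singular values from a full SVD
nuc = np.linalg.svd(A, compute_uv=False).sum()

# NumPy also exposes this directly as the 'nuc' matrix norm
assert abs(nuc - np.linalg.norm(A, ord='nuc')) < 1e-12
```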

A question I have is whether it is possible to compute the nuclear norm of a given matrix faster than computing its full SVD. Since we need only the sum of the singular values rather than the values themselves, this seems plausible. Alternatively, perhaps it could be approximated with randomized methods and/or random projections.
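To illustrate the random-projection idea, here is a hypothetical sketch (the function name and sketch size `k` are my own, not from any standard library): project $A$ onto a random low-dimensional subspace and sum the singular values of the small projected matrix. In exact arithmetic this gives a lower bound on the nuclear norm, and it becomes tight when the sketch captures the full range of $A$.

```python
import numpy as np

def nuclear_norm_lower_bound(A, k=20, rng=None):
    """Crude randomized estimate (a lower bound in exact arithmetic).

    Sketch the range of A with a Gaussian test matrix, then sum the
    singular values of the small projected matrix. k is a hypothetical
    sketch size; accuracy depends on how fast the singular values decay.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    Omega = rng.standard_normal((n, k))
    Q, _ = np.linalg.qr(A @ Omega)   # orthonormal basis for a sketch of range(A)
    B = Q.T @ A                      # small (k x n) projected matrix
    return np.linalg.svd(B, compute_uv=False).sum()
```

Since $\|Q^\top A\|_* \le \|Q^\top\|_2 \, \|A\|_* = \|A\|_*$, the estimate never exceeds the true nuclear norm; with a full-rank sketch ($k = n$) it recovers it exactly.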

2 Answers


A tentative answer: the nuclear norm of $A$ is the trace of $\sqrt{A^*A}$ where $A^*$ is the conjugate-transpose of $A$, and $\sqrt{\cdot}$ is the matrix square root. So provided you can calculate matrix square roots faster than singular value decompositions, this might be useful.
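A minimal sketch of this identity using NumPy/SciPy (illustrative only: `scipy.linalg.sqrtm` itself runs through a Schur decomposition, so it is not faster than an SVD):

```python
import numpy as np
from scipy.linalg import sqrtm

def nuclear_norm_via_sqrtm(A):
    """Nuclear norm as tr(sqrt(A^* A)); sqrtm returns the principal square root.

    A^* A is positive semidefinite, so its principal square root has the
    singular values of A as eigenvalues, and the trace sums them.
    """
    return float(np.trace(sqrtm(A.conj().T @ A)).real)
```

The identity only pays off if a matrix-square-root routine faster than an SVD is available, which is exactly the proviso in the answer above.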


This is just a guess, but since the sum of the singular values of $A$ is the maximum of $\operatorname{tr}(AU)$ over orthogonal $U$, might one be able to estimate it by sweeping over Givens rotations and maximizing greedily?

The maximum over $\theta$ of $\operatorname{tr}(A\,G(i,j,\theta))$ is

$$\operatorname{tr}(A) + \operatorname{hypot}\!\big(A(i,i)+A(j,j),\; A(i,j)-A(j,i)\big) - A(i,i) - A(j,j).$$

If this exceeds $\operatorname{tr}(A)$, one could replace $A$ with $A\,G(i,j,\theta^*)$, where $\theta^*$ is the maximizing angle, and try another pair $(i,j)$.
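A hypothetical sketch of this sweep (my own code, not the answerer's). One caveat to flag: Givens rotations only generate rotations ($\det U = 1$), so for real $A$ with $\det A < 0$ the sweep can reach at most $\sum_i \sigma_i - 2\sigma_{\min}$; it is a heuristic lower bound, not a guaranteed computation.

```python
import numpy as np

def nuclear_norm_givens(A, sweeps=50, tol=1e-12):
    """Greedy Jacobi-style estimate of the nuclear norm of a square real matrix.

    Maximizes tr(A @ U) over products of Givens rotations by repeatedly
    applying, for each pair (i, j), the angle that maximizes the trace.
    Heuristic sketch: the result never exceeds the true nuclear norm.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for _ in range(sweeps):
        improved = 0.0
        for i in range(n - 1):
            for j in range(i + 1, n):
                a = A[i, i] + A[j, j]       # coefficient of cos(theta) in the trace
                b = A[i, j] - A[j, i]       # coefficient of sin(theta) in the trace
                gain = np.hypot(a, b) - a   # trace improvement at the optimal angle
                if gain > tol:
                    r = np.hypot(a, b)
                    c, s = a / r, b / r     # cos and sin of the optimal theta
                    # Apply the Givens rotation to columns i and j (A <- A G)
                    col_i, col_j = A[:, i].copy(), A[:, j].copy()
                    A[:, i] = c * col_i + s * col_j
                    A[:, j] = -s * col_i + c * col_j
                    improved += gain
        if improved < tol:
            break
    return np.trace(A)
```

Each rotation raises the trace by exactly $\operatorname{hypot}(a, b) - a$, matching the formula above, and the sweep stops once no pair yields an improvement.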