How to get a full $m$-by-$m$ matrix $U$ from a thin SVD?


I am using the function gsl_linalg_SV_decomp from the GNU Scientific Library to solve a least-squares problem

$$\min \|Ax-b\|_2$$

where $A \in \mathbb R^{m \times n}$. The procedure is to first find the SVD of $A$

$$A = U\Sigma V^T$$

where $U \in \mathbb R^{m \times m}$ and $V \in \mathbb R^{n \times n}$ are orthogonal, i.e., $U^TU = UU^T = I_m$ and $V^TV = VV^T = I_n$.

However, gsl_linalg_SV_decomp returns a thin SVD, i.e., it provides a $U \in \mathbb R^{m \times n}$. Therefore, I need to extend the thin matrix $U$ to a full orthogonal matrix. I've been thinking about randomly appending some orthonormal columns to $U$, but I'm not sure if it is appropriate.


Update: After some derivation, I found that a thin SVD is actually enough for solving the least-squares problem if the residual is ignored. But in my problem there is a constraint $\|Ax-b\| < \gamma$, so the residual norm is needed and a full SVD is required.
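For what it's worth, the split can be made explicit. Write the full orthogonal matrix as $U = [\, U_1 \;\; U_2 \,]$, where $U_1 \in \mathbb R^{m \times n}$ is the thin factor and $U_2 \in \mathbb R^{m \times (m-n)}$ spans the left null space of $A$ (this block notation is mine, not the library's). Since the 2-norm is invariant under orthogonal transformations,

$$\|Ax - b\|_2^2 = \|U^T(Ax - b)\|_2^2 = \|\Sigma V^T x - U_1^T b\|_2^2 + \|U_2^T b\|_2^2.$$

The first term is driven to zero by $x = V\Sigma^{-1}U_1^T b$, which needs only the thin factors; the residual norm is exactly the second term, $\|U_2^T b\|_2$, which is where the extra columns of $U$ come in.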

There are 2 answers below.

BEST ANSWER

Suppose you have a tall matrix $\mathrm U_1 \in \mathbb R^{m \times n}$, whose $n$ columns are the left singular vectors spanning the column space of the rank-$n$ matrix $\rm A$. To obtain a tall matrix $\mathrm U_2 \in \mathbb R^{m \times (m-n)}$, whose $m-n$ columns are the left singular vectors spanning the left null space of $\rm A$, find an orthonormal basis of the (right) null space of $\mathrm U_1^\top$: for example, compute a basis via the RREF and then orthonormalize it with Gram-Schmidt, or read the complementary columns off a complete QR factorization of $\mathrm U_1$.
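A minimal NumPy sketch of this construction (an illustration, not GSL code; the names `U1`, `U2`, `U_full` are mine): a complete QR factorization of $\mathrm U_1$ yields an orthonormal basis of $\mathbb R^m$ whose first $n$ columns span the column space of $\mathrm U_1$, so the remaining $m-n$ columns span the left null space of $\rm A$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3

# A thin SVD, as gsl_linalg_SV_decomp would give: U1 is m-by-n
# with orthonormal columns.
A = rng.standard_normal((m, n))
U1, s, Vt = np.linalg.svd(A, full_matrices=False)

# Complete QR of U1: the first n columns of Q span col(U1), so the
# last m-n columns form an orthonormal basis of the null space of
# U1^T, i.e. the left null space of A.
Q, _ = np.linalg.qr(U1, mode="complete")
U2 = Q[:, n:]

# Append them to get a full m-by-m orthogonal matrix.
U_full = np.hstack([U1, U2])

# Sanity checks: U_full is orthogonal, and U2 annihilates A from the left.
print(np.allclose(U_full.T @ U_full, np.eye(m)))
print(np.allclose(U2.T @ A, 0))
```

This is also what "randomly appending orthonormal columns" should amount to in practice: any orthonormal basis of the null space of $\mathrm U_1^\top$ works, since the corresponding singular values are all zero.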

ANSWER

All you need for least squares or the pseudoinverse is the thin SVD. The extra columns of $U$ and $V$ in the full SVD are nulled out in the pseudoinverse $V\Sigma^+ U^T$, because the corresponding entries in the lower-right portion of $\Sigma^+$ are zero.

The GNU Scientific Library provides what most people need (and use) for the SVD; computing the full SVD is more expensive than the thin SVD.
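To illustrate the point, here is a NumPy sketch (again an illustration, not GSL code) solving the least-squares problem from the thin SVD alone, via $x = V\Sigma^{-1}U^T b$, and cross-checking against a reference solver:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Thin SVD: U is m-by-n, s holds the n singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Least-squares solution from the thin factors: x = V diag(1/s) U^T b.
x = Vt.T @ ((U.T @ b) / s)

# Cross-check against a reference least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))
```

No extension of $U$ is needed for the solution itself; the full factor only matters if you want the residual expressed as $\|U_2^T b\|$.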