I am using the function gsl_linalg_SV_decomp provided by the GNU Scientific Library to solve a least-squares problem
$$\min \|Ax-b\|_2$$
where $A \in \mathbb R^{m \times n}$. The procedure is to first find the SVD of $A$
$$A = U\Sigma V^T$$
where $U \in \mathbb R^{m \times m}$ and $V \in \mathbb R^{n \times n}$ are orthogonal, i.e., $U^TU = UU^T = I_m$ and $V^TV = VV^T = I_n$.
However, gsl_linalg_SV_decomp returns a thin SVD, i.e., it provides a $U \in \mathbb R^{m \times n}$. Therefore, I need to extend the thin matrix $U$ to a full orthogonal matrix. I've been thinking about appending randomly generated orthonormal columns to $U$, but I'm not sure whether that is appropriate.
Update: After working through the derivation, I found that a thin SVD is actually sufficient for solving the least-squares problem if the residual is ignored. But in my problem there is a constraint $\|Ax-d\| < \gamma$, so a full SVD is required.
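To see why the thin SVD already determines the minimizer, note that for full-column-rank $A = U_1 \Sigma V^T$ the solution is $x = V\Sigma^{-1}U_1^T b$, which involves only the thin factors. A small sketch in Python/NumPy (used here for illustration in place of GSL; the sizes and variable names are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Thin SVD: U1 is m x n, the same shape gsl_linalg_SV_decomp returns.
U1, s, Vt = np.linalg.svd(A, full_matrices=False)

# Least-squares minimizer from the thin factors only: x = V S^{-1} U1^T b.
x = Vt.T @ ((U1.T @ b) / s)

# Sanity check against NumPy's own least-squares solver.
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])

# Even the residual norm is available without the full U, since
# A x = U1 U1^T b, so ||Ax - b||^2 = ||b||^2 - ||U1^T b||^2.
r2 = b @ b - (U1.T @ b) @ (U1.T @ b)
assert np.isclose(np.sqrt(max(r2, 0.0)), np.linalg.norm(A @ x - b))
```

This only covers the plain least-squares solve; the constrained variant above may still need the full $U$.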
Suppose you have a tall matrix $\mathrm U_1 \in \mathbb R^{m \times n}$ whose $n$ columns are the left singular vectors spanning the column space of the rank-$n$ matrix $\mathrm A$. To obtain a tall matrix $\mathrm U_2 \in \mathbb R^{m \times (m-n)}$ whose $m-n$ columns are left singular vectors spanning the left null space of $\mathrm A$, find an orthonormal basis of the (right) null space of $\mathrm U_1^\top$ using, say, the RREF.
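As a concrete check of this construction, here is a NumPy sketch that computes such a basis with a complete QR factorization of $\mathrm U_1$ (one practical alternative to the RREF; the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 3
A = rng.standard_normal((m, n))

# U1 holds the n left singular vectors spanning col(A).
U1, s, Vt = np.linalg.svd(A, full_matrices=False)

# A complete QR factorization of U1 yields an m x m orthogonal Q whose
# first n columns span col(U1); the remaining m - n columns form an
# orthonormal basis of null(U1^T), i.e. the left null space of A.
Q, _ = np.linalg.qr(U1, mode='complete')
U2 = Q[:, n:]

# [U1 | U2] is now a full m x m orthogonal matrix, and U2^T A = 0.
U_full = np.hstack([U1, U2])
assert np.allclose(U_full.T @ U_full, np.eye(m))
assert np.allclose(U2.T @ A, 0)
```

Any orthonormal basis of the orthogonal complement works here, since the left singular vectors beyond the first $n$ are only determined up to an orthogonal change of basis of that subspace.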