Finding full SVD from reduced SVD


I am no math expert, so please forgive any lack of proper terminology; I will do my best.

As a personal exercise, I am trying to implement least squares with no external dependencies. From my reading I have learned about the SVD, singular values, and the Moore-Penrose pseudoinverse.

I have found an algorithm by Golub and Reinsch which computes the reduced SVD (http://people.duke.edu/~hpgavin/SystemID/References/Golub+Reinsch-NM-1970.pdf):

$A = \hat U \hat \Sigma V^*$
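To make the distinction concrete, here is a small sketch (using `numpy` purely as an illustration, since the goal is a dependency-free implementation) of how the reduced and full SVD differ in shape for a tall $m \times n$ matrix:

```python
import numpy as np

# Illustrative example: shapes of reduced vs. full SVD for a tall matrix.
m, n = 5, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))

# Reduced ("thin") SVD: U_hat is m x n, with n singular values.
U_hat, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U_hat.shape)  # (5, 3)

# Full SVD: U is m x m; the extra m - n columns span the orthogonal
# complement of the column space of A.
U, s_full, Vt_full = np.linalg.svd(A, full_matrices=True)
print(U.shape)  # (5, 5)

# Both factorizations reconstruct A exactly.
A_rec = U_hat @ np.diag(s) @ Vt
print(np.allclose(A_rec, A))  # True
```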

I also read that for:

$Ax = b$

the least-squares solution is

$x = (A'A)^{-1}A'b$

If $A'A$ is not invertible (for example because $A$ is rank-deficient, or nearly so due to small errors), this formula breaks down, and the Moore-Penrose pseudoinverse is used instead:

$A^+=(A'A)^{-1}A'$ (valid when $A$ has full column rank) and, in general, $A^+=V\Sigma^+U^*$
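A quick sanity check (again using `numpy` only for illustration) that the two pseudoinverse formulas agree when $A$ has full column rank, and that the SVD form only needs the reduced factors:

```python
import numpy as np

# Hypothetical check: (A'A)^{-1} A' equals V Sigma^+ U_hat' for a
# full-column-rank A.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Normal-equations form (requires A'A invertible).
pinv_normal = np.linalg.inv(A.T @ A) @ A.T

# SVD form: Sigma^+ inverts the nonzero singular values. Note that only
# the reduced factors U_hat, s, V are needed here.
U_hat, s, Vt = np.linalg.svd(A, full_matrices=False)
pinv_svd = Vt.T @ np.diag(1.0 / s) @ U_hat.T

print(np.allclose(pinv_normal, pinv_svd))  # True
x = pinv_svd @ b  # least-squares solution
```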

My question is: how do I expand $\hat U$ to a full $m \times m$ matrix (given that $A$ is $m \times n$)?
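For context, here is a sketch of one generic way such an expansion can be done (this is not the Golub-Reinsch procedure itself, just an illustrative basis completion with `numpy`): append arbitrary columns to $\hat U$ and orthonormalize, e.g. via QR.

```python
import numpy as np

# Illustrative sketch: extend the m x n matrix U_hat (orthonormal
# columns) to a full m x m orthogonal matrix by completing the basis.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
U_hat, s, Vt = np.linalg.svd(A, full_matrices=False)
m, n = U_hat.shape

# Append m - n random columns and orthonormalize with QR; since the
# first n columns are already orthonormal, Q's first n columns equal
# U_hat up to sign, and the remaining columns complete the basis.
M = np.hstack([U_hat, rng.standard_normal((m, m - n))])
U_full, _ = np.linalg.qr(M)

print(np.allclose(U_full.T @ U_full, np.eye(m)))  # True
```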

Additionally, I was wondering how the error for each part of least squares is calculated.