Linear Least Squares using SVD for rank deficient case


For the linear least squares problem:

$\min_{\boldsymbol{x}}\|\boldsymbol{y}-A\boldsymbol{x}\|_2^2.$

Using the SVD of $A\in\mathbb{R}^{m\times n}$ ($m\geq n$), given as:

$A=\sum_{i=1}^r\sigma_i\boldsymbol{u}_i\boldsymbol{v}_i^T$.

where $r>0$ is the rank of the matrix $A$ (we consider the rank-deficient case, $r<n$), I wish to prove that the minimizer is given by:

$\boldsymbol{x}^*=\sum_{i=1}^r\frac{\boldsymbol{u}_i^T\boldsymbol{y}}{\sigma_i}\boldsymbol{v}_i+\sum_{i=r+1}^n\tau_i\boldsymbol{v}_i$

Now the first part of the expression I can obtain quite easily, using the orthogonal invariance of the 2-norm:

$\begin{aligned} \|\mathbf{y}-A\mathbf{x}\|_2^2 &= \|U^{\top}\mathbf{y}-\Sigma V^{\top}\mathbf{x}\|_{2}^{2} \\ &= \left\|\left[\begin{array}{c}U_r^\top\\U_{m-r+1}^\top\end{array}\right]\mathbf{y}-\left[\begin{array}{cc}\Sigma_r&0\\0&0\end{array}\right]\left[\begin{array}{c}V_r^\top\\V_{n-r+1}^\top\end{array}\right]\mathbf{x}\right\|_2^2 \\ &= \left\|U_{r}^{\top}\mathbf{y}-\Sigma_{r}V_{r}^{\top}\mathbf{x}\right\|_{2}^{2}+\left\|U_{m-r+1}^{\top}\mathbf{y}\right\|_{2}^{2}. \end{aligned}$

This yields the minimizer $\boldsymbol{x}^* = V_{r}\Sigma_{r}^{-1}U_{r}^{\top}\mathbf{y}=\sum_{i=1}^{r}\frac{\mathbf{u}_{i}^{\top}\mathbf{y}}{\sigma_{i}}\mathbf{v}_{i}$, which makes the first term vanish.
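A quick numerical sanity check of this first part (a sketch in NumPy; the shapes, seed, and rank are made-up example data, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-deficient A (m=6, n=4, rank 2) as a product of thin factors.
m, n = 6, 4
A = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))
y = rng.standard_normal(m)

# Full SVD; determine the numerical rank by thresholding the singular values.
U, s, Vt = np.linalg.svd(A)
tol = max(m, n) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))

# First part of the minimizer: x* = V_r Sigma_r^{-1} U_r^T y
x_star = Vt[:r].T @ ((U[:, :r].T @ y) / s[:r])

# It agrees with the pseudoinverse (minimum-norm least-squares) solution.
assert np.allclose(x_star, np.linalg.pinv(A) @ y)
```

Here `np.linalg.pinv` implements exactly the truncated-SVD pseudoinverse, so the agreement confirms the closed form.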

And the second term in the equation above represents the residual. However, I can't seem to derive the second part of the solution, $\sum_{i=r+1}^n\tau_i\boldsymbol{v}_i$. I suspect it comes from manipulating $\left\|U_{m-r+1}^{\top}\mathbf{y}\right\|_{2}^{2}$.
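For what it's worth, I did check numerically that adding multiples of the trailing right singular vectors $\boldsymbol{v}_{r+1},\dots,\boldsymbol{v}_n$ leaves the residual unchanged, whatever the coefficients $\tau_i$ are (a sketch with made-up data; the seed and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 6, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
y = rng.standard_normal(m)

U, s, Vt = np.linalg.svd(A)
x_star = Vt[:r].T @ ((U[:, :r].T @ y) / s[:r])

# Arbitrary coefficients tau_i on the trailing right singular vectors
# v_{r+1}, ..., v_n (a basis for the null space of A).
tau = rng.standard_normal(n - r)
x_alt = x_star + Vt[r:].T @ tau

# A maps the added component to (numerically) zero, so the residual
# norm is the same for x_star and x_alt.
res_star = np.linalg.norm(y - A @ x_star)
res_alt = np.linalg.norm(y - A @ x_alt)
assert np.isclose(res_star, res_alt)
```

So the $\tau_i$ term seems to parametrize a whole family of minimizers rather than coming out of the residual term, but I'd like to see how that drops out of the derivation above.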

Any pointers?