Least square regression and singular value decomposition


Let $A\in \mathcal{R}^{n\times d}$ be a matrix with singular value decomposition $A = U_A\Sigma_A V_A^T$. For any matrix $U\in \mathcal{R}^{n\times l}$ with orthonormal columns, let $U^\perp \in \mathcal{R}^{n\times (n-l)}$ be a matrix whose columns form an orthonormal basis for the subspace of $\mathcal{R}^n$ orthogonal to the column space of $U$.

Then show that

$Z = \min_{x\in \mathcal{R}^d}\lVert Ax-b\rVert_2 = \lVert U_A^\perp (U_A^\perp)^T b\rVert_2$.

Note that this is the least-squares regression problem; when $A$ has full column rank, its exact solution is given by

$x^\star = A^+b = V_A\Sigma_A^{-1} U_A^Tb$.

where $A^+$ denotes the Moore-Penrose pseudoinverse of $A$.
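The claimed identity is easy to sanity-check numerically. A minimal sketch with NumPy (random full-column-rank $A$; the dimensions and seed are arbitrary assumptions, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3
A = rng.standard_normal((n, d))  # generically full column rank
b = rng.standard_normal(n)

# Full SVD: U is n x n; the first d columns span range(A),
# the remaining n - d columns play the role of U_A^perp.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
U_perp = U[:, d:]

# Exact least-squares solution via the pseudoinverse.
x_star = np.linalg.pinv(A) @ b

Z = np.linalg.norm(A @ x_star - b)
rhs = np.linalg.norm(U_perp @ U_perp.T @ b)
assert np.isclose(Z, rhs)
```

The assertion passes because the residual $b - Ax^\star$ is exactly the projection of $b$ onto the orthogonal complement of $\operatorname{range}(A)$, which is what the proof below makes precise.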

Reference this, Section 2.2.




Hint 1:

$\min_x \|f(x)\|_2 \Leftrightarrow \min_x \|f(x)\|_2^2$

Hint 2:

$\|x\|_2^2 = x^Tx$

Can you differentiate this and find the point $x$ at which the expression is minimized?
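Carrying the hints through, a sketch of the differentiation (assuming $A$ has full column rank, so $A^TA$ is invertible):

```latex
\|Ax-b\|_2^2 = (Ax-b)^T(Ax-b) = x^TA^TAx - 2b^TAx + b^Tb
% Setting the gradient with respect to x to zero:
\nabla_x \|Ax-b\|_2^2 = 2A^TAx - 2A^Tb = 0
\;\Longrightarrow\; A^TAx = A^Tb
\;\Longrightarrow\; x^\star = (A^TA)^{-1}A^Tb = A^+b
```

These are the normal equations, and the last equality uses $(A^TA)^{-1}A^T = A^+$, which holds precisely in the full-column-rank case.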

Edit (following the author's comments):

Let $ e = b - Ax^\star$

Then $e = b - AA^+b = (I-AA^+)b = (I-U\Sigma \Sigma^+ U^T)b = U(I-\Sigma\Sigma^+)U^Tb$, where $A = U\Sigma V^T$ is the full SVD ($U \in \mathcal{R}^{n\times n}$ orthogonal, $\Sigma \in \mathcal{R}^{n\times d}$).

So the matrix in the middle (which plays the role of the diagonal factor in an eigenvalue decomposition) has diagonal entries equal to $0$ for those $u_i$ that are basis vectors of the range of $A$, and equal to $1$ for those that span the orthogonal complement of the range of $A$.

So $e \in \operatorname{range}(A)^\perp$.

Also, from the last equality, since only the columns of $U$ spanning the orthogonal complement survive the multiplication by $I-\Sigma\Sigma^+$, we get $e = U_A^\perp (U_A^\perp)^T b$, and therefore $Z = \|e\|_2 = \|U_A^\perp (U_A^\perp)^T b\|_2$.
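The structure of the middle matrix can also be checked directly. A small NumPy sketch (dimensions and seed are arbitrary assumptions): $\Sigma\Sigma^+$ is an $n\times n$ diagonal matrix with ones in the first $d$ positions and zeros after, so $I-\Sigma\Sigma^+$ kills the range directions and keeps the complement.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 2
A = rng.standard_normal((n, d))  # generically full column rank
b = rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros((n, d))
Sigma[:d, :d] = np.diag(s)

# I - Sigma Sigma^+ is diagonal: 0 on range directions, 1 on the complement.
M = np.eye(n) - Sigma @ np.linalg.pinv(Sigma)
assert np.allclose(np.diag(M), np.r_[np.zeros(d), np.ones(n - d)])

# The residual e = (I - A A^+) b equals U_perp U_perp^T b.
e = b - A @ np.linalg.pinv(A) @ b
U_perp = U[:, d:]
assert np.allclose(e, U_perp @ U_perp.T @ b)
```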