SVD products and the approximation error using Frobenius norm


This is my first time posting a question here. The question is about bounding the error of SVD products in the Frobenius norm.

Suppose I have a real $m \times n$ matrix $A$ with $m < n$. Performing the SVD on $A$ gives $A = USV^T$.

Truncating to rank $k$ gives the approximation $A \approx A_k = U_kS_kV^T_k$.

Ignoring $U, U_k$, I define two new matrices by

$T = SV^T (m \times n)$

$T_k = S_kV^T_k (k \times n)$
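As a concrete sketch of this setup (dimensions and variable names are my own toy choices), in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 6, 2                 # m < n, truncation rank k < m
A = rng.standard_normal((m, n))

# Thin SVD: U is m x m, s holds the m singular values, Vt is m x n.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

T = np.diag(s) @ Vt               # T   = S V^T,     shape (m, n)
Tk = np.diag(s[:k]) @ Vt[:k]      # T_k = S_k V_k^T, shape (k, n)

# Sanity check: U T reconstructs A exactly.
assert np.allclose(U @ T, A)
```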

I want to quantify the error bound when I apply the transpose of each matrix to some experiment model, denoted by a matrix $K$:

$\delta = \frac{KT^T_k}{KT^T}$

Applying the Frobenius norm gives

$\delta = \frac{||KT_k^T||_F}{||KT^T||_F}$

Using the sub-multiplicativity of the Frobenius norm, the above can be written as

$\delta \leq \frac{||K||_F||T_k^T||_F}{||K||_F||T^T||_F}$

Then

$\delta \leq \frac{||K||_F||V_kS_k||_F}{||K||_F||VS||_F}$

Since the Frobenius norm is invariant under the orthonormal-column matrices $V, V_k$,

$\delta \leq \frac{||K||_F||S_k||_F}{||K||_F||S||_F}$

Finally,

$\delta \leq \frac{||S_k||_F}{||S||_F}$

which is reduced to

$\delta \leq \sqrt{\frac{\sigma_{k+1}^2+\dots+\sigma_{m}^2}{\sigma_{1}^2+\dots+\sigma_{m}^2}}$

Have I made a mistake in any step? The result is completely the opposite of my linear algebra proof. I suspect that the sub-multiplicativity step is not right.

1 Answer

Shouldn't your last equation be $$\delta \leq \sqrt{\frac{k}{n}}\sqrt{\frac{\sigma_{1}^2+\dots+\sigma_{k}^2}{\sigma_{1}^2+\dots+\sigma_{\min(m,n)}^2}}$$ since $||V_k||_F=\sqrt{\operatorname{tr}(V_k^TV_k)}=\sqrt{\operatorname{tr}(I_k)}=\sqrt{k}$, and similarly $||V||_F=\sqrt{n}$?
Also, I think you made a typo in $||S_k||_F$ and $||S||_F$: $S_k$ keeps the $k$ largest singular values $\sigma_1,\dots,\sigma_k$, not the discarded tail $\sigma_{k+1},\dots,\sigma_m$.
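A quick numerical check of the norms discussed above (random toy data and dimensions are my own):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 4, 6, 2
A = rng.standard_normal((m, n))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

Sk = np.diag(s[:k])
Vk = Vt[:k].T                          # n x k, orthonormal columns

# ||V_k||_F = sqrt(tr(V_k^T V_k)) = sqrt(k), not k.
assert np.isclose(np.linalg.norm(Vk), np.sqrt(k))

# ||S_k||_F keeps the *leading* singular values sigma_1..sigma_k, so
# ||S_k||_F / ||S||_F = sqrt((s_1^2+...+s_k^2) / (s_1^2+...+s_m^2)).
ratio = np.linalg.norm(Sk) / np.linalg.norm(np.diag(s))
assert np.isclose(ratio, np.sqrt(np.sum(s[:k]**2) / np.sum(s**2)))

# And ||V_k S_k||_F = ||S_k||_F exactly, by orthonormality of V_k's columns.
assert np.isclose(np.linalg.norm(Vk @ Sk), np.linalg.norm(Sk))
```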