If I have $N$ orthonormal vectors $\phi_{i,N\times 1}$ of length $N$, obtained from the SVD of an $N\times M$ matrix $U$: $$ U_{N\times M} = \sum_{i=1}^N \sigma_i\phi_{i,N\times 1}\times\psi_{i,1\times M}\\ =\Phi\times\Sigma\times\Psi\\ =[\phi_1,\phi_2, ...,\phi_N]_{N\times N} \times diag(\sigma_i) \times [\psi_1;\psi_2;...;\psi_N]_{N\times M} $$
$$ \phi_i^T\times\phi_j = \delta_{ij} $$ My question is: how close is the sum of the outer products of $K$ such vectors to the identity matrix, and in what measure? $$ R^K = \sum_{i=1}^K{\phi_{i}\times\phi_i^T} \approx I $$ in which $R^K$ is the $K$-th order reconstruction matrix.
Background:
The underlying question is: how accurate is a signal reconstruction based on SVD?
I am doing SVD signal reconstruction in the domain of fluid mechanics (a.k.a. Proper Orthogonal Decomposition, POD). From the theory of SVD, the decomposition splits the signal into three matrices, and a subset of the modes can be used to reconstruct the original signal without too much loss of accuracy:
$$ U_{N\times M} = \sum_{i=1}^N \sigma_i\phi_{i,N\times 1}\times\psi_{i,1\times M}\\ =\Phi\times\Sigma\times\Psi\\ =[\phi_1,\phi_2, ...,\phi_N]_{N\times N} \times diag(\sigma_i) \times [\psi_1;\psi_2;...;\psi_N]_{N\times M} $$ SVD: $U$ is the sampling matrix with $N$ timesteps and $M$ points, the $\sigma_i$ are the singular values sorted in decreasing order, the $\phi_i$ are the orthonormal temporal mode vectors, and the $\psi_i$ are the orthonormal spatial mode vectors:
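The decomposition and the orthogonality relations above can be checked numerically; here is a minimal sketch using NumPy (the variable names `Phi`, `sigma`, `Psi` mirror the notation here, and the sizes are arbitrary, assuming $N \le M$):

```python
import numpy as np

N, M = 8, 20                      # N timesteps, M spatial points
rng = np.random.default_rng(0)
U = rng.standard_normal((N, M))   # sampling matrix

# full_matrices=False gives Phi (N x N), sigma (N,), Psi (N x M)
Phi, sigma, Psi = np.linalg.svd(U, full_matrices=False)

# U = Phi * diag(sigma) * Psi, with sigma sorted in decreasing order
assert np.allclose(U, Phi @ np.diag(sigma) @ Psi)
# orthogonality: Phi^T Phi = I_N, and the rows of Psi are orthonormal
assert np.allclose(Phi.T @ Phi, np.eye(N))
assert np.allclose(Psi @ Psi.T, np.eye(N))
```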
$$ I = \Phi\times\Phi^T = \Phi^T\times\Phi = \Psi\times\Psi^T $$
Usually, the algorithm solves the eigenvalue problem of the matrix $U\times U^T$, which yields a series of eigenvalues $\lambda_i = \sigma_i^2$ and the corresponding temporal mode vectors $\phi_i$. The spatial modes are then recovered using the orthogonality of the temporal mode vectors: $$ \Psi = \Sigma^{-1}\times\Phi^T\times U $$
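A minimal sketch of this eigenvalue-based algorithm (all variable names and sizes are illustrative), with a sanity check that the three factors reproduce $U$:

```python
import numpy as np

N, M = 8, 20
rng = np.random.default_rng(1)
U = rng.standard_normal((N, M))

# eigen-decomposition of the (symmetric) correlation matrix U U^T
eigvals, Phi_e = np.linalg.eigh(U @ U.T)
# eigh returns ascending order; sort descending so sigma_1 >= sigma_2 >= ...
order = np.argsort(eigvals)[::-1]
eigvals, Phi_e = eigvals[order], Phi_e[:, order]
sigma = np.sqrt(np.clip(eigvals, 0.0, None))   # lambda_i = sigma_i^2

# Psi = Sigma^{-1} Phi^T U
Psi_e = np.diag(1.0 / sigma) @ Phi_e.T @ U

# sanity check: Phi Sigma Psi reproduces the sampling matrix
assert np.allclose(U, Phi_e @ np.diag(sigma) @ Psi_e)
```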
During reconstruction, I deduced this formula by substituting $\psi_i = \sigma_i^{-1}\phi_i^T\times U$ (is it correct?): $$ U_R^K = \sum_{i=1}^K{\sigma_i\phi_{i}\times\psi_i}\\ =(\sum_{i=1}^K{\phi_{i}\times\phi_i^T})\times U\\ = R^K\times U $$ in which $U_R^K$ is the $K$-th order reconstruction of the sample matrix $U$ and $R^K$ is the reconstruction matrix. We can prove that if $K=N$, $R^K$ is identical to $I$. My question is how close $R^K$ is to $I$ for $K<N$.
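This derivation can be verified numerically; the sketch below (illustrative names) checks both $U_R^K = R^K\times U$ and that $K=N$ gives $R^N = I$:

```python
import numpy as np

N, M, K = 8, 20, 3
rng = np.random.default_rng(2)
U = rng.standard_normal((N, M))
Phi, sigma, Psi = np.linalg.svd(U, full_matrices=False)

# K-term truncated SVD: sum_{i<=K} sigma_i phi_i psi_i
U_R = Phi[:, :K] @ np.diag(sigma[:K]) @ Psi[:K, :]
# reconstruction matrix R^K = sum_{i<=K} phi_i phi_i^T
R_K = Phi[:, :K] @ Phi[:, :K].T

assert np.allclose(U_R, R_K @ U)             # U_R^K = R^K U
assert np.allclose(Phi @ Phi.T, np.eye(N))   # K = N: R^N = Phi Phi^T = I
```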
I tried to use the sub-multiplicative norm inequality:
$$ ||A\times B||\le ||A||\cdot ||B|| $$ so, $$ ||U_R^K - U|| = ||(R^K - I)\times U||\le ||R^K - I||\cdot ||U|| $$ I wanted to reach a conclusion like this:
if the characteristic velocity in the flow field is 1000 m/s, the characteristic error of the reconstructed flow field is no more than $||R^K - I||\times 1000\,\mathrm{m/s}$.
So I first tried $||A||_{max}$, the maximum absolute value of the elements of $A$, but it seems that $||A||_{max}$ is not sub-multiplicative and does not satisfy the inequality $||A\times B||\le ||A||\cdot ||B||$. According to the answer here, a better choice is the Frobenius norm $||A||_F = \sqrt{trace(A^T\times A)}$, which is related to the root mean square of the velocity.
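A quick numerical check of this Frobenius-norm bound (illustrative names and sizes); it also checks that the actual truncation error is $||U_R^K - U||_F = \sqrt{\sum_{i>K}\sigma_i^2}$, which follows from the orthogonality of the modes:

```python
import numpy as np

N, M = 8, 20
rng = np.random.default_rng(3)
U = rng.standard_normal((N, M))
Phi, sigma, Psi = np.linalg.svd(U, full_matrices=False)

for K in range(1, N + 1):
    R_K = Phi[:, :K] @ Phi[:, :K].T
    U_R = R_K @ U
    lhs = np.linalg.norm(U_R - U, 'fro')
    rhs = np.linalg.norm(R_K - np.eye(N), 'fro') * np.linalg.norm(U, 'fro')
    # sub-multiplicative bound: ||U_R^K - U||_F <= ||R^K - I||_F ||U||_F
    assert lhs <= rhs + 1e-12
    # exact error: the root of the sum of the discarded squared singular values
    assert np.isclose(lhs, np.sqrt(np.sum(sigma[K:]**2)))
```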
Then the only question is: how close is $R^K$ to $I$ in the Frobenius norm? I do not have a strong math background, but in practice $R^K$ is really close to $I$ when the retained eigenvalues capture most of the trace of the correlation matrix. There must be some internal mathematical relationship that I do not know.