Is it useful to understand SVD/QR factorization well enough to have their formulas memorized?


Is knowing how to compute the SVD, the QR factorization, or the power iteration/power method by hand, without notes, useful in applications such as statistics?

Nowadays, we can do everything with Mathematica or a computer without really understanding the mechanics of matrices. Does knowing the manual computation cold assist us with anything?

I've taken linear algebra, but every time I have to do a QR factorization or SVD, I have to look up the formula. It doesn't make enough intuitive sense to me to stay memorized. Should I take another linear algebra class, one that assumes basic knowledge of linear algebra but focuses specifically on matrix decompositions?


There is 1 best solution below

On BEST ANSWER

I think if we understand these factorizations sufficiently well, then we can rederive them on the spot. For example, if we know that the QR factorization encodes the Gram-Schmidt process, then we can recover both the formula for the QR factorization and a way to compute it (by carrying out the Gram-Schmidt process).
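As a sketch of this idea, here is a classical Gram-Schmidt QR in NumPy (illustrative only; production code would use a numerically stabler variant such as Householder reflections, which is what `np.linalg.qr` does):

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR via classical Gram-Schmidt (a sketch, not production code).

    Orthonormalizes the columns of A left to right; R records the
    projection coefficients, so A = Q @ R by construction.
    """
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            # Coefficient of A's j-th column along the i-th orthonormal vector
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)  # remaining length after projections
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = qr_gram_schmidt(A)
# Q has orthonormal columns, R is upper triangular, and Q @ R reconstructs A.
```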

For the SVD, one should understand that $v_i$ is chosen to be the unit vector $v$ which maximizes $\| Av \|$ subject to the constraint that $v$ is orthogonal to the vectors $v_1, \ldots, v_{i-1}$. If we define a nonnegative scalar $\sigma_i$ and unit vector $u_i$ so that $A v_i = \sigma_i u_i$, and if we are good at block matrix multiplication, then we can see immediately that $$ \tag{1}AV = U \Sigma, $$ where $V$ is the matrix whose columns are the vectors $v_i$, $U$ is the matrix whose columns are the vectors $u_i$, and $\Sigma$ is a diagonal matrix with diagonal entries $\sigma_i$. Multiplying both sides of (1) on the right by $V^{-1} = V^T$, we see that $A = U \Sigma V^T$, and we can't forget it.
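The identity $AV = U\Sigma$, and hence $A = U\Sigma V^T$, is easy to check numerically (here using NumPy's built-in SVD on an arbitrary example matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# np.linalg.svd returns U, the singular values, and V^T;
# full_matrices=False gives the "thin" factorization.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Sigma = np.diag(s)

# A V = U Sigma, and multiplying on the right by V^T (= V^{-1},
# since V is orthogonal) gives A = U Sigma V^T.
assert np.allclose(A @ Vt.T, U @ Sigma)
assert np.allclose(A, U @ Sigma @ Vt)
```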

We could equivalently say that $v_i$ is chosen to be the unit vector $v$ that maximizes $\| Av\|^2 = v^T A^T A v$ subject to the constraint that $v$ is orthogonal to the vectors $v_1, \ldots, v_{i-1}$. If we are familiar with the variational characterization of the eigenvalues of a symmetric matrix, we now recognize that the vectors $v_i$ are eigenvectors of the matrix $A^T A$. This gives us a way to compute them, if we are able to compute eigenvectors of a symmetric matrix (which is not necessarily easy or obvious).
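This eigenvector view also connects directly to the power method mentioned in the question: repeatedly applying $A^T A$ to a vector and renormalizing converges (generically, when the top eigenvalue is simple) to $v_1$. A minimal sketch, with a hypothetical helper name:

```python
import numpy as np

def top_singular_triplet(A, iters=500):
    """Leading singular triplet via power iteration on A^T A (a sketch).

    Iterating v <- (A^T A v) / ||A^T A v|| converges to the eigenvector
    of A^T A with largest eigenvalue, i.e. the unit vector v_1
    maximizing ||Av||; then sigma_1 = ||A v_1|| and u_1 = A v_1 / sigma_1.
    """
    rng = np.random.default_rng(1)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = A.T @ (A @ v)  # apply A^T A without forming it explicitly
        v = w / np.linalg.norm(w)
    sigma = np.linalg.norm(A @ v)
    u = A @ v / sigma
    return u, sigma, v

A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])
u, sigma, v = top_singular_triplet(A)
# sigma approximates the largest singular value of A.
```

To find the later triplets one would repeat the iteration while re-orthogonalizing against the $v_i$ already found (or deflate $A$), mirroring the orthogonality constraint in the characterization above.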

(There are more efficient algorithms for computing these factorizations which are not obvious.)