Understanding the loss of orthogonality of Classical Gram-Schmidt Procedure


I am having a hard time understanding the "loss of orthogonality" of the classical Gram-Schmidt procedure (CGSP) and how the modified Gram-Schmidt procedure (MGSP) resolves this issue.

The following two pseudocodes are for CGSP and MGSP:

$$
\begin{array}{l|l}
\textbf{function } [Q,R] = \mathrm{CGSP}(A) & \textbf{function } [Q,R] = \mathrm{MGSP}(A)\\
\textbf{for } j = 0, \ldots, n-1 & \textbf{for } j = 0, \ldots, n-1\\
\quad a_j^\perp := a_j & \\
\quad \textbf{for } k = 0, \ldots, j-1 & \quad \textbf{for } k = 0, \ldots, j-1\\
\qquad \rho_{k,j} := q_k^H a_j & \qquad \rho_{k,j} := q_k^H a_j\\
\qquad a_j^\perp := a_j^\perp - \rho_{k,j} q_k & \qquad a_j := a_j - \rho_{k,j} q_k\\
\quad \textbf{end} & \quad \textbf{end}\\
\quad \rho_{j,j} := \| a_j^\perp \|_2 & \quad \rho_{j,j} := \| a_j \|_2\\
\quad q_j := a_j^\perp / \rho_{j,j} & \quad q_j := a_j / \rho_{j,j}\\
\textbf{end} & \textbf{end}
\end{array}
$$

(In CGSP every coefficient $\rho_{k,j}$ is computed from the original column $a_j$, which is never updated; in MGSP the coefficient is computed from the current, partially orthogonalized $a_j$, which is updated in place.)
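The two procedures can be sketched in NumPy as a direct translation of the pseudocode (my own translation, written for real matrices; for complex input the plain transposes below would become conjugate transposes):

```python
import numpy as np

def cgsp(A):
    """Classical Gram-Schmidt: every projection coefficient uses the
    ORIGINAL column a_j, so errors introduced by earlier subtractions
    are never corrected."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()                 # a_j^perp starts as a_j
        for k in range(j):
            R[k, j] = Q[:, k] @ A[:, j]    # rho_{k,j} = q_k^T a_j (original)
            v -= R[k, j] * Q[:, k]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

def mgsp(A):
    """Modified Gram-Schmidt: each coefficient is computed against the
    PARTIALLY ORTHOGONALIZED vector, so the subtraction at step k also
    removes components reintroduced by earlier rounding errors."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for k in range(j):
            R[k, j] = Q[:, k] @ v          # rho_{k,j} = q_k^T (updated a_j)
            v -= R[k, j] * Q[:, k]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R
```

On a well-conditioned matrix both routines return $A \approx QR$ with $Q^T Q \approx I$; the two only diverge in floating point when $A$ is ill-conditioned.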

In MATLAB, I considered a large matrix of size $100\times 100$ and performed a QR decomposition using CGSP, after which I computed $\|I-QQ^{T}\|_{2}$ and obtained a value of $1.991\times10^{-1}$, whereas MGSP gave me $2.012\times10^{-15}$. I therefore wish to understand the following:
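The experiment can be reproduced in NumPy. Since a generic random matrix is fairly well conditioned, the sketch below manufactures an ill-conditioned test matrix with a prescribed singular-value spread (the construction and the condition number $10^{10}$ are my choices, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Ill-conditioned test matrix: random orthogonal factors around a
# diagonal of singular values decaying geometrically from 1 to 1e-10.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -10, n)) @ V.T

def gram_schmidt(A, modified):
    """Both variants in one loop; `modified` selects which vector the
    projection coefficient is computed against."""
    n = A.shape[1]
    Q = np.zeros_like(A)
    for j in range(n):
        v = A[:, j].copy()
        for k in range(j):
            src = v if modified else A[:, j]   # the only difference
            v -= (Q[:, k] @ src) * Q[:, k]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

for name, modified in [("CGSP", False), ("MGSP", True)]:
    Q = gram_schmidt(A, modified)
    loss = np.linalg.norm(np.eye(n) - Q.T @ Q, 2)
    print(f"{name}: ||I - Q^T Q||_2 = {loss:.3e}")
```

On this matrix CGSP loses orthogonality almost completely (the loss grows roughly like $\kappa(A)^2 u$, which here exceeds $1$), while MGSP keeps it near $\kappa(A)\,u$, several orders of magnitude smaller.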

  1. Geometrically, how does the loss of orthogonality of $q_k$ arise when we project the $k^{th}$ column vector $a_k$ onto the previously computed vectors $q_0, \ldots, q_{k-1}$? That is, why does this type of loss exist geometrically?
  2. How does MGSP resolve this issue? In the MGSP algorithm above, which line actually fixes the loss of orthogonality?