Linear algebra - Givens rotation when diagonal becomes zero


I'm trying to wrap my head around Givens rotations for computing the QR factorization of a square matrix A. I understand the general case, but there is one question I cannot get out of my head.

Let's imagine for a moment that I have the following matrix

\begin{bmatrix} A_1 & B_1 & C_1 \\ A_2 & B_2 & C_2 \\ A_3 & B_3 & C_3 \end{bmatrix}

But suppose that A_1 = B_1 and A_2 = B_2, so: \begin{bmatrix} A_1 & A_1 & C_1 \\ A_2 & A_2 & C_2 \\ A_3 & B_3 & C_3 \end{bmatrix}

After two steps of Givens rotations I will be left with:

\begin{bmatrix} \tilde{A_1} & \tilde{A_1} & \tilde{C_1} \\ 0 & 0 & \tilde{C_2} \\ 0 & \tilde{B_3} & \tilde{C_3} \end{bmatrix}

The problem is that when I try to take the third step, the diagonal entry B_2 will be equal to 0, so I cannot form the Givens rotation in the usual way. I understand there are easy ways to patch this (*), but I was wondering whether there is a formal way to handle it.

(*) The patch would be the following:

  • If the entire column below the diagonal is 0, then I have already finished with this column and can move on to the next one. In the example above, I simply proceed to column 3. \begin{bmatrix} \tilde{A_1} & \tilde{A_1} & \tilde{C_1} \\ 0 & 0 & \tilde{C_2} \\ 0 & 0 & \tilde{C_3} \end{bmatrix}

  • If the entire column below the diagonal is not 0, then I can use a permutation matrix to swap the diagonal position with a position that is not 0. A permutation matrix IS orthogonal, so it will not spoil the algorithm, and it will not ruin the previous columns either.

With the example above, the following permutation matrix \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix} would give you \begin{bmatrix} \tilde{A_1} & \tilde{A_1} & \tilde{C_1} \\ 0 & \tilde{B_3} & \tilde{C_3} \\ 0 & 0 & \tilde{C_2} \end{bmatrix}
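As a sanity check on this patch, here is a short NumPy sketch (the numeric values are made-up stand-ins for the tilde entries above, not from the question) confirming that the permutation is orthogonal and moves the nonzero entry onto the diagonal without disturbing the already-finished first column:

```python
import numpy as np

# Stand-in values for the partially triangularized matrix above:
# zero diagonal at position (2, 2), nonzero B_3 below it.
M = np.array([[2.0, 2.0, 5.0],
              [0.0, 0.0, 3.0],
              [0.0, 4.0, 1.0]])

# Permutation matrix swapping rows 2 and 3.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

PM = P @ M  # the nonzero entry now sits on the diagonal
print(PM)
```

Since P @ P.T is the identity, inserting P between the accumulated rotations keeps the overall transformation orthogonal, exactly as the bullet above argues.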

EDIT:

Answer in the comments.

BEST ANSWER

The question was answered in the comments (thanks, algebraic pavel). If the matrix has a zero pivot on the diagonal, then the Givens rotation for that entry will be:

\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}

If this is multiplied by any matrix of the form \begin{bmatrix} 0 & B \\ C & D\end{bmatrix}

then the result is

\begin{bmatrix} C & D \\ 0 & -B\end{bmatrix}

Which means that the algorithm actually performs a row exchange (up to a sign) for us! There is no need for patches or special cases. If any mod reads this, please close it as solved.
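To make this concrete, here is a minimal NumPy sketch (the function names and the example matrix are my own, not from the thread) of Givens QR in which the usual coefficient formula c = a/r, s = b/r handles the zero-pivot case automatically, producing exactly the \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} rotation from the answer:

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b]^T = [r, 0]^T.

    When a == 0 and b != 0 this gives c = 0, s = 1, i.e. the rotation
    [[0, 1], [-1, 0]] from the answer: a row swap, up to a sign.
    """
    r = np.hypot(a, b)
    if r == 0.0:                      # both entries already zero: identity
        return 1.0, 0.0
    return a / r, b / r

def qr_givens(A):
    """QR factorization of a square matrix via Givens rotations (sketch)."""
    R = np.array(A, dtype=float)
    n = R.shape[0]
    Q = np.eye(n)
    for j in range(n):                # zero out column j below the diagonal
        for i in range(j + 1, n):
            c, s = givens(R[j, j], R[i, j])
            G = np.eye(n)
            G[j, j], G[j, i] = c, s
            G[i, j], G[i, i] = -s, c
            R = G @ R                 # annihilates R[i, j]
            Q = Q @ G.T               # accumulate Q so that A = Q R
    return Q, R

# A matrix whose first two columns agree in rows 1 and 2, as in the
# question: after column 1 is cleared, the (2, 2) pivot of R is zero.
A = np.array([[1.0, 1.0, 3.0],
              [2.0, 2.0, 4.0],
              [5.0, 6.0, 7.0]])
Q, R = qr_givens(A)
```

No branching on a zero pivot is needed: `givens(0.0, b)` simply returns c = 0, s = 1, and the resulting rotation performs the row exchange that the question's permutation patch did by hand.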