QR decomposition for singular matrices


I read in textbook that every $m$ by $n$ matrix with independent columns can be factored into $A=QR$. The columns of $Q$ are orthonormal, and $R$ is upper triangular and invertible.

I don't fully understand why $R$ is invertible when $A$ has independent columns. My thought is:

$A=QR\rightarrow Q^TA=R$ (multiply both sides by $Q^T$, and $Q^TQ=I$)

If $A$ is not square but has a right inverse $A^{-1}$, then

$Q^TAA^{-1}Q=I=RA^{-1}Q$

So $A^{-1}Q$ is the inverse of $R$. But having a right inverse requires independent rows, and I don't see how independent columns are related to that.

To clear my doubt, I looked at some other materials, and saw the following statement:

Every invertible matrix has a $QR$ decomposition, where $R$ is invertible. As a side note, it bears mentioning that this result holds even if the matrix is not invertible: Every matrix has a $QR$ decomposition, though $R$ may not always be invertible.

It confused me even more. If a matrix $A$ does not have independent columns, what will its $QR$ decomposition look like? As far as I know, $Q$ has pairwise orthogonal columns, all of unit length, and its shape is the same as $A$. But if $A$ doesn't have independent columns in the first place, how do we find the corresponding $Q$?

I am stuck and hope someone can help me make it clear. Any help is appreciated! Thanks in advance.

Best Answer

Regarding your original question:

We want to show that if $A$ has linearly independent columns and $A = QR$ with $R$ square and upper triangular (in fact I'll ignore the properties of $Q$), then $R$ must be invertible.

Note that $A$ has linearly independent columns if and only if $Ax = 0$ implies that $x = 0$ for $x \in \Bbb R^n$.

Now, suppose for the purpose of contradiction that $R$ fails to be invertible. It follows that the columns of $R$ fail to be linearly independent, which is to say that $Rx = 0$ has a non-zero solution. This contradicts our statement about $A$, however, since the same non-zero $x$ satisfies $$ Ax = (QR)x = Q(Rx) = Q(0) = 0. $$
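This argument is easy to check numerically. Here is a quick sketch using NumPy (the $3\times 2$ matrix below is a made-up example, not from the question): a tall matrix with independent columns gets a "thin" QR factorization whose $R$ factor has a nonzero diagonal, hence is invertible.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])  # 3x2, columns are linearly independent

Q, R = np.linalg.qr(A)  # reduced ("thin") QR: Q is 3x2, R is 2x2

# R is upper triangular; a nonzero diagonal means it is invertible.
print(np.all(np.abs(np.diag(R)) > 0))  # True
print(np.allclose(Q @ R, A))           # True: the factors reconstruct A
```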


Regarding the different kinds of $QR$-decompositions:

The confusion here stems from the fact that there are different versions of the $QR$ decomposition. Your "other materials" state the following:

Every invertible matrix has a $QR$ decomposition, where $R$ is invertible... Every matrix has a $QR$ decomposition, though $R$ may not always be invertible.

In this context, a "$QR$ decomposition" requires an orthogonal matrix $Q$ (so $Q$ is always square) and an upper-triangular matrix $R$ of the same size as $A$.

With this definition, $R$ will have the same rank as $A$ since we have $R = Q^TA$, and $Q^T$ is an invertible matrix.
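To see this concretely, here is a sketch with NumPy on a made-up rank-$1$ matrix: the "full" decomposition (NumPy calls it `mode='complete'`) exists even though the columns are dependent, and $R$ inherits the rank of $A$.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])  # rank 1: second column = 2 * first column

Q, R = np.linalg.qr(A, mode='complete')  # Q is 3x3 orthogonal, R is 3x2

print(np.linalg.matrix_rank(A))  # 1
print(np.linalg.matrix_rank(R))  # 1, since R = Q^T A and Q^T is invertible
print(np.allclose(Q @ R, A))     # True
```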

On the other hand, your original source states the following:

Every $m$ by $n$ matrix with independent columns can be factored into $A=QR$. The columns of $Q$ are orthonormal, and $R$ is upper triangular and invertible.

In this context, $Q$ is not necessarily square, but $R$ is. Since $Q$ and $R$ are smaller here than in the full version, this is sometimes called a "thin" or "reduced" factorization. Since you are looking at a statement about matrices $A$ with linearly independent columns, let's focus on this case.

First, note that since the columns are linearly independent, we necessarily have $m \geq n$. In the case where $m > n$, we can write the product as follows: $$ A = QR = \pmatrix{Q_1 & Q_2} \pmatrix{\tilde R\\0} $$ The sizes here are as follows: $Q_1$ is $m\times n$, $Q_2$ is $m \times (m-n)$, $\tilde R$ is $n \times n$, and $0$ is $(m-n) \times n$.

In fact, this partition is conformable for a block-matrix product. With block-matrix multiplication, we find that $$ QR = \pmatrix{Q_1 & Q_2} \pmatrix{\tilde R\\0} = Q_1\tilde R + Q_2 \cdot 0 = Q_1 \tilde R. $$ So $A = Q_1 \tilde R$ is a "thin" $QR$-decomposition in which the columns of $Q_1$ are orthonormal (but $Q_1$ is not necessarily square), and if $A$ has linearly independent columns, then $\tilde R$ is invertible.
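The block product above can be verified numerically. In this NumPy sketch (with a made-up $3\times 2$ matrix), we compute the full factorization, slice off $Q_1$ and $\tilde R$, and check that the bottom block of $R$ is zero and that $A = Q_1 \tilde R$.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
m, n = A.shape

Q, R = np.linalg.qr(A, mode='complete')  # full QR: Q is m x m, R is m x n

Q1 = Q[:, :n]        # first n orthonormal columns of Q
R_tilde = R[:n, :]   # top n x n block of R

print(np.allclose(R[n:, :], 0))      # True: the bottom block of R is zero
print(np.allclose(Q1 @ R_tilde, A))  # True: the "thin" factors recover A
```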