Rank Factorisation as product of non-singular and orthogonal matrices


The statement comes from Page 20 of Rao (1973):

Let $A$ be an $m\times n$ matrix of rank $r$. Then there exist a nonsingular $M:m\times m$ and an orthogonal $N:n\times n$ such that

$$A = M\left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right) N$$

Can anyone give a hint about why this is true?


I don't have access to Rao's book, so I'll use a constructive approach via elementary matrices, hopefully without introducing more advanced topics. There are many other proofs; see, for example, the proof via the SVD sketched at the end. (Note that the elementary-matrix construction below only produces a nonsingular $N$; the SVD route is the cleaner way to get the orthogonal $N$ of Rao's statement.)

First, applying a basic column operation (interchanging two columns, adding a multiple of one column to another, etc.) is equivalent to multiplying by an elementary matrix on the right. Similarly, row operations correspond to multiplying by elementary matrices on the left.
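As a quick numerical illustration (a hypothetical NumPy snippet, not from the original argument), right-multiplication by an elementary matrix really does act on the columns:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])

# Elementary matrix that swaps columns 0 and 1 (identity with those columns swapped)
E_swap = np.eye(3)
E_swap[:, [0, 1]] = E_swap[:, [1, 0]]

# Elementary matrix that adds 2 * (column 0) to column 2
E_add = np.eye(3)
E_add[0, 2] = 2.

print(A @ E_swap)  # columns 0 and 1 of A are interchanged
print(A @ E_add)   # column 2 becomes column 2 + 2 * column 0
```

Left-multiplying by the analogous matrices performs the corresponding row operations.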

Second, the original problem is equivalent to finding $M,N$ such that $$M^{-1}AN^{-1}= \left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right) $$

So we look for invertible matrices $P,Q$ such that

$$PAQ= \left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right) $$

i.e. $P=M^{-1}$ and $Q=N^{-1}$. You'll see that $P$ and $Q$ are simply products of elementary matrices.

Assume $A$ to be an $m\times n$ matrix and $\operatorname{rank}A=r$.

  1. Interchange columns so that the first $r$ columns are linearly independent. Then each of the remaining $n-r$ columns can be written as a linear combination of the first $r$ columns.

  2. Subtract these linear combinations from the remaining $n-r$ columns, so that those columns become zero.

  3. Steps 1 and 2 amount to multiplying $A$ by a nonsingular matrix on the right. In other words, we can find elementary matrices $R_1,\dots,R_s$ such that $AR_1\cdots R_s$ has its first $r$ columns linearly independent and its remaining $n-r$ columns zero. Let $A_1=AR_1\cdots R_s$ and $Q=R_1\cdots R_s$.

  4. Similarly, we can make the first $r$ rows of $A_1$ linearly independent and the remaining $m-r$ rows zero. Call the resulting matrix $A_2$; it has the special form $$A_2 = \left( \begin{matrix} A_{11} & 0 \\0 & 0\end{matrix} \right) $$ where $A_{11}$ is an $r\times r$ matrix, and it is invertible since $\operatorname{rank}A=\operatorname{rank}A_{11}=r$. Equivalently, there is a nonsingular matrix $P_1$ such that $P_1A_1=A_2$.

  5. Let $$P_2 = \left( \begin{matrix} A_{11}^{-1} & 0 \\0 & I_{m-r}\end{matrix} \right) $$ Then $P_2$ is clearly invertible. Let $P=P_2P_1$, which is invertible too.

  6. You can verify that $$P_2A_2 = \left( \begin{matrix} A_{11}^{-1}A_{11} & 0 \\0 & 0\end{matrix} \right) =\left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right)$$ Hence, with $P=P_2P_1$ and $Q$ as above, $$PAQ= \left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right) $$ which is what we were looking for.
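The six steps above can be sketched numerically. The helper below is hypothetical (not Rao's own construction): it assumes real matrices, replaces hand-chosen column/row swaps with a greedy rank test, and uses an arbitrary tolerance `1e-10` to decide independence. It returns invertible $P,Q$ with $PAQ$ in the rank normal form:

```python
import numpy as np

def rank_normal_form(A, tol=1e-10):
    """Sketch of steps 1-6: invertible P, Q with P @ A @ Q = [[I_r, 0], [0, 0]]."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape

    # Steps 1: greedily pick a maximal set of linearly independent columns
    # and permute them to the front (A @ Pi reorders the columns).
    cols = []
    for j in range(n):
        if np.linalg.matrix_rank(A[:, cols + [j]], tol=tol) > len(cols):
            cols.append(j)
    r = len(cols)
    Pi = np.eye(n)[:, cols + [j for j in range(n) if j not in cols]]

    # Steps 2-3: write the last n-r columns as B1 @ X and subtract them.
    B = A @ Pi
    X = np.linalg.lstsq(B[:, :r], B[:, r:], rcond=None)[0]
    R = np.block([[np.eye(r), -X],
                  [np.zeros((n - r, r)), np.eye(n - r)]])
    Q = Pi @ R                      # A @ Q = [B1 | 0]
    A1 = A @ Q

    # Step 4: same procedure on the rows, acting from the left.
    rows = []
    for i in range(m):
        if np.linalg.matrix_rank(A1[rows + [i], :], tol=tol) > len(rows):
            rows.append(i)
    Sigma = np.eye(m)[rows + [i for i in range(m) if i not in rows], :]
    C = Sigma @ A1                  # first r rows independent, last n-r cols zero
    Y = C[r:, :r] @ np.linalg.inv(C[:r, :r])   # bottom rows = Y @ top rows
    P1 = np.block([[np.eye(r), np.zeros((r, m - r))],
                   [-Y, np.eye(m - r)]]) @ Sigma
    A2 = P1 @ A1                    # [[A11, 0], [0, 0]]

    # Steps 5-6: scale the leading r x r block to the identity.
    P2 = np.block([[np.linalg.inv(A2[:r, :r]), np.zeros((r, m - r))],
                   [np.zeros((m - r, r)), np.eye(m - r)]])
    return P2 @ P1, Q
```

For a random rank-$2$ matrix `A`, `P, Q = rank_normal_form(A)` gives `P @ A @ Q` equal (up to round-off) to the block matrix with $I_2$ in the top-left corner and zeros elsewhere.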

If you know the SVD, it's even simpler. Compute the SVD of $A$, $$A= U\left( \begin{matrix} \Sigma_r & 0 \\0 & 0\end{matrix} \right)V^T$$ where $\Sigma_r$ is an $r\times r$ diagonal matrix with positive diagonal entries, and $U,V$ are orthogonal (unitary in the complex case). You just need to "extract" the diagonal factor and absorb it into $U$: taking $$M = U\left( \begin{matrix} \Sigma_r & 0 \\0 & I_{m-r}\end{matrix} \right), \qquad N = V^T,$$ the matrix $M$ is nonsingular and $N$ is orthogonal, and $A=M\left( \begin{matrix} I_r & 0 \\0 & 0\end{matrix} \right)N$, exactly as in Rao's statement.
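The SVD route can be checked in a few lines (a NumPy sketch; the matrix and the rank tolerance `1e-10` are my own illustrative choices):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [2., 4., 0.]])            # a rank-1 example

U, s, Vt = np.linalg.svd(A)             # full SVD: A = U @ Smat @ Vt
m, n = A.shape
r = int(np.sum(s > 1e-10))

# Fold the positive singular values into U to get nonsingular M;
# the orthogonal Vt serves as N.
M = U @ np.diag(np.concatenate([s[:r], np.ones(m - r)]))
N = Vt

D = np.zeros((m, n))
D[:r, :r] = np.eye(r)
print(np.allclose(M @ D @ N, A))        # True
```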