Rank of a lower triangular block matrix


For $$A= \begin{bmatrix}B&0\\C&D\end{bmatrix}$$ where $B, C, D$ are matrices that may be rectangular, is it true or false that $$\text{rank}(A)=\text{rank}(B)+\text{rank}(D)$$

I think that if $C=0$ this is true, since the rank of $A$ is the number of linearly independent columns, which is the number of linearly independent columns of $B$ plus that of $D$. But does $C$ affect this relationship? Or is it still true that $\text{rank}(A)=\text{rank}(B)+\text{rank}(D)$ when $C$ is nonzero?


Best Answer

The rank of the overall block triangular matrix is greater than or equal to the sum of the ranks of its diagonal blocks. I.e.: $$\text{rank(A)} \ge \text{rank}(B) + \text{rank}(D).$$

Equality is not necessarily attained, as exemplified by the following matrix $$\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix},$$ which has rank 1 but diagonal blocks with rank 0.
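As a quick numerical sanity check of this counterexample (a sketch using NumPy; `np.linalg.matrix_rank` computes the rank via the SVD):

```python
import numpy as np

# Counterexample from the text: B = [0], C = [1], D = [0].
A = np.array([[0, 0],
              [1, 0]])

print(np.linalg.matrix_rank(A))          # rank(A) = 1
print(np.linalg.matrix_rank(A[:1, :1]))  # rank(B) = 0
print(np.linalg.matrix_rank(A[1:, 1:]))  # rank(D) = 0
```

So the inequality is strict here: $1 > 0 + 0$.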

We can prove this inequality in the following two stages:

  1. Prove the inequality for square $B$, $D$.

  2. Reduce the rectangular case to the square case.

Square case

To begin, suppose that $B$ and $D$ are square, and consider their full SVDs: \begin{align*} B &= U \Sigma V^* \\ D &= W \Lambda Q^*, \end{align*} where $U,V,W,Q$ are square unitary (orthonormal) matrices containing singular vectors as columns, and $\Sigma, \Lambda$ are square diagonal matrices containing the singular values on the diagonal. Substituting these factorizations into the blocks of $A$ and factorizing, we see that the block-triangular matrix $A$ is unitarily equivalent to an actually-triangular matrix with diagonal elements given by the singular values: $$A = \begin{bmatrix} B \\ C & D \end{bmatrix}= \begin{bmatrix} U \Sigma V^* \\ C & W \Lambda Q^* \end{bmatrix}= \begin{bmatrix} U \\ & W \end{bmatrix} \begin{bmatrix} \Sigma \\ W^* C V & \Lambda \end{bmatrix} \begin{bmatrix} V^* \\ & Q^* \end{bmatrix}.$$ Since multiplying by the unitary outer factors does not change the rank, we can invoke the fact that the rank of a triangular matrix is greater than or equal to the number of its nonzero diagonal entries to arrive at the desired conclusion: $$\text{rank}(A) = \text{rank}\left(\begin{bmatrix} \Sigma \\ W^* C V & \Lambda \end{bmatrix}\right) \ge \text{rank}(\Sigma) + \text{rank}(\Lambda) = \text{rank}(B) + \text{rank}(D).$$
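The unitary-equivalence factorization above can be verified numerically. The sketch below (using NumPy; block sizes and the rank-deficient $B$ are arbitrary choices for illustration) rebuilds $A$ from the three factors and checks the rank inequality:

```python
import numpy as np

rng = np.random.default_rng(0)

# Square blocks; B is made rank-deficient so the inequality is non-trivial.
B = rng.standard_normal((3, 2)) @ rng.standard_normal((2, 3))  # 3x3, rank 2
C = rng.standard_normal((4, 3))
D = rng.standard_normal((4, 4))                                # rank 4 (generically)

# np.linalg.svd returns U, the singular values s, and V^* as Vh.
U, s, Vh = np.linalg.svd(B)
W, lam, Qh = np.linalg.svd(D)

A = np.block([[B, np.zeros((3, 4))],
              [C, D]])

# Middle factor of the unitary equivalence: [[Sigma, 0], [W^* C V, Lambda]].
M = np.block([[np.diag(s),                    np.zeros((3, 4))],
              [W.conj().T @ C @ Vh.conj().T,  np.diag(lam)]])

# A = blockdiag(U, W) @ M @ blockdiag(V^*, Q^*), so rank(A) = rank(M).
A_rebuilt = (np.block([[U, np.zeros((3, 4))], [np.zeros((4, 3)), W]])
             @ M
             @ np.block([[Vh, np.zeros((3, 4))], [np.zeros((4, 3)), Qh]]))

print(np.allclose(A, A_rebuilt))  # True
print(np.linalg.matrix_rank(A)
      >= np.linalg.matrix_rank(B) + np.linalg.matrix_rank(D))  # True
```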

Reduction of rectangular case to square case

If $B$ and/or $D$ are not square, simply delete enough "redundant" columns and/or rows to make them square, while keeping their rank the same. This is always possible to do, since

  • the rank of a matrix is less than or equal to the smallest dimension of the matrix, and

  • a matrix of rank $r$ must have at least $r$ linearly independent columns/rows that we can keep.

The process of row and column deletion can be characterized by left- and right-multiplying by "selection" matrices, created by taking an identity matrix and deleting rows or columns from it (see this question). Call the selection matrices associated with deleting rows and columns of $A$ by the names $S_R$ and $S_C$, respectively. We have: $$S_R A S_C = \begin{bmatrix} \tilde{B} \\ \tilde{C} & \tilde{D} \end{bmatrix},$$ where $\tilde{B}, \tilde{D}$ have the same ranks as $B, D$, respectively, but are square. Using the fact that the rank of a product of matrices is less than or equal to the rank of any individual factor, and applying the result shown above for square matrices, we get \begin{align*} \text{rank}(A) &\ge \text{rank}(S_R A S_C) \\ &= \text{rank}\left(\begin{bmatrix} \tilde{B} \\ \tilde{C} & \tilde{D} \end{bmatrix}\right) \\ &\ge \text{rank}(\tilde{B}) + \text{rank}(\tilde{D}) \\ &= \text{rank}(B) + \text{rank}(D), \end{align*} which is the desired inequality.
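The selection-matrix reduction can be sketched concretely. In the example below (NumPy; the block sizes and the choice of which column/row to delete are illustrative assumptions), $B$ is $2\times 3$ and $D$ is $3\times 2$, so one column of $B$ and one row of $D$ are deleted to make both square:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rectangular blocks: B is 2x3, D is 3x2, so A is 5x5.
B = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 3))
D = rng.standard_normal((3, 2))
A = np.block([[B, np.zeros((2, 2))],
              [C, D]])

# Selection matrices built from the 5x5 identity:
# keep columns 0,1 of B (drop its last column) and both columns of D;
# keep both rows of B and rows 0,1 of D (drop its last row).
S_C = np.eye(5)[:, [0, 1, 3, 4]]
S_R = np.eye(5)[[0, 1, 2, 3], :]

T = S_R @ A @ S_C  # 4x4, with square 2x2 diagonal blocks B~ and D~
B_t, D_t = T[:2, :2], T[2:, 2:]

print(T.shape)  # (4, 4)
print(np.linalg.matrix_rank(A) >= np.linalg.matrix_rank(T))  # True
print(np.linalg.matrix_rank(T)
      >= np.linalg.matrix_rank(B_t) + np.linalg.matrix_rank(D_t))  # True
```

Here the deleted column and row were picked by hand; in general one must keep a maximal set of independent columns/rows so that $\tilde{B}, \tilde{D}$ retain the ranks of $B, D$.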

Second Answer

Here's an easy proof that $\mathrm{rank}(A) \geq \mathrm{rank}(B) + \mathrm{rank}(D)$. The idea is that you can just take linearly independent columns in $B$ and $D$, and the corresponding columns in $A$ are linearly independent.

More precisely, pick maximal sets $\{i_1, \ldots, i_n\}$ such that $\vec{b}_{i_1}, \ldots, \vec{b}_{i_n}$ is linearly independent and $\{j_1, \ldots, j_m\}$ such that $\vec{d}_{j_1}, \ldots, \vec{d}_{j_m}$ is linearly independent. I claim $\vec{a}_{i_1}, \ldots, \vec{a}_{i_n}, \vec{a}_{N+j_1}, \ldots, \vec{a}_{N+j_m}$ is linearly independent, where $N$ is the number of columns of $B$. We have $\vec{a}_{i_\ell} = (\vec{b}_{i_\ell}, \vec{c}_{i_\ell})$ and $\vec{a}_{N+j_\ell} = (\vec{0}, \vec{d}_{j_\ell})$.

Suppose $$ \sum_{\ell=1}^n \alpha_\ell (\vec{b}_{i_\ell}, \vec{c}_{i_\ell}) + \sum_{\ell=1}^m \beta_\ell (\vec{0}, \vec{d}_{j_\ell}) = (\vec{0}, \vec{0}).$$ Then from the first component, $$ \sum_{\ell=1}^n \alpha_\ell \vec{b}_{i_\ell} = \vec{0}. $$ By linear independence, each $\alpha_\ell=0$. Hence the second component reads $$ \sum_{\ell=1}^m \beta_\ell \vec{d}_{j_\ell} = \vec{0}. $$ Again by linear independence, each $\beta_\ell=0$. Thus the chosen $n+m$ columns of $A$ are linearly independent, so $\mathrm{rank}(A) \ge n + m = \mathrm{rank}(B) + \mathrm{rank}(D)$.
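This column-stacking argument can also be checked numerically. In the sketch below (NumPy; sizes and the particular index sets are illustrative assumptions), we take independent columns of $B$ and of $D$, stack the corresponding columns of $A$, and confirm their rank equals $\mathrm{rank}(B) + \mathrm{rank}(D)$:

```python
import numpy as np

rng = np.random.default_rng(2)

B = rng.standard_normal((3, 2)) @ rng.standard_normal((2, 4))  # 3x4, rank 2
C = rng.standard_normal((2, 4))
D = rng.standard_normal((2, 3))                                # 2x3, rank 2
A = np.block([[B, np.zeros((3, 3))],
              [C, D]])

# Index sets of independent columns (chosen here for this random example;
# in general any maximal independent sets work).
bi = [0, 1]      # columns of B
dj = [0, 1]      # columns of D
N = B.shape[1]   # column offset of the D-block inside A

cols = A[:, bi + [N + j for j in dj]]
print(np.linalg.matrix_rank(cols))  # equals rank(B) + rank(D)
```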