Comparing ranks of block matrices


This seems simple, but I don't really know how to prove it, so I'd be thankful for any help.

Let $A$, $B$, and $C$ be arbitrary $n\times n$ matrices, and let $0$ denote the $n\times n$ zero matrix. Show that

$$\operatorname{rk}\left(\left[\matrix{A&0\\0&B}\right]\right)\leqslant \operatorname{rk}\left(\left[\matrix{A&0\\C&B}\right]\right).$$
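Before proving it, a quick numerical sanity check with NumPy (the rank-deficient construction of $A$ and $B$ is an arbitrary illustrative choice, made so the inequality has room to be strict):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4
Z = np.zeros((n, n))
rk = np.linalg.matrix_rank

for _ in range(200):
    # Random rank-2 A and B, arbitrary C.
    A = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
    B = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
    C = rng.standard_normal((n, n))
    lhs = rk(np.block([[A, Z], [Z, B]]))  # [[A, 0], [0, B]]
    rhs = rk(np.block([[A, Z], [C, B]]))  # [[A, 0], [C, B]]
    assert lhs <= rhs

# The inequality can be strict: A = B = 0 and C = I gives ranks 0 and n.
assert rk(np.block([[Z, Z], [np.eye(n), Z]])) == n
```

The last line shows that equality need not hold: with $A = B = 0$ the left-hand side has rank $0$, while an invertible $C$ makes the right-hand side have rank $n$.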


Best answer:

Let $A = \begin{bmatrix} \alpha_1 & \cdots & \alpha_n\end{bmatrix}$, $B = \begin{bmatrix} \beta_1 & \cdots & \beta_n\end{bmatrix}$, $C = \begin{bmatrix} \gamma_1 & \cdots & \gamma_n\end{bmatrix}$, where $\alpha_i, \beta_i, \gamma_i$ are column vectors of $A, B, C$ respectively.

Consider any linearly independent column vectors $\{\delta_{i_1}, \ldots, \delta_{i_p}, \xi_{j_1}, \ldots, \xi_{j_q}\}$ of the matrix $\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}$, where $1 \leq i_1 < \cdots < i_p \leq n, 1 \leq j_1 < \cdots < j_q \leq n$. Note that $p$ or $q$ can be $0$, in which case the corresponding vectors are absent from the set. Here, the $\delta_i$ are columns taken from $\begin{bmatrix} A \\ 0 \end{bmatrix}$, while the $\xi_j$ are columns taken from $\begin{bmatrix} 0 \\ B \end{bmatrix}$.

Since $\{\delta_{i_1}, \ldots, \delta_{i_p}, \xi_{j_1}, \ldots, \xi_{j_q}\}$ are linearly independent, so are $\{\delta_{i_1}, \ldots, \delta_{i_p}\}$ and $\{\xi_{j_1}, \ldots, \xi_{j_q}\}$, which respectively imply that \begin{align} & a_1\alpha_{i_1} + \cdots + a_p\alpha_{i_p} = 0 \implies a_1 = \cdots = a_p = 0 \tag{1} \\ & b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0 \implies b_1 = \cdots = b_q = 0. \tag{2} \end{align}

Now we can show that the corresponding columns taken from $\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$ are linearly independent. In fact, if \begin{align} a_1\begin{bmatrix}\alpha_{i_1} \\ \gamma_{i_1} \end{bmatrix} + \cdots + a_p\begin{bmatrix}\alpha_{i_p} \\ \gamma_{i_p} \end{bmatrix} + b_1\begin{bmatrix}0 \\ \beta_{j_1} \end{bmatrix} + \cdots + b_q\begin{bmatrix}0 \\ \beta_{j_q} \end{bmatrix} = \begin{bmatrix}0 \\ 0 \end{bmatrix}, \end{align} then \begin{align} & a_1\alpha_{i_1} + \cdots + a_p\alpha_{i_p} = 0 \tag{3} \\ & a_1\gamma_{i_1} + \cdots + a_p\gamma_{i_p} + b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0 \tag{4} \end{align} $(3)$ and $(1)$ imply $a_1 = \cdots = a_p = 0$. Substituting $a_1 = \cdots = a_p = 0$ into $(4)$, we get $b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0$, which further implies $b_1 = \cdots = b_q = 0$, by $(2)$. That shows $a_1 = \cdots = a_p = b_1 = \cdots = b_q = 0$, i.e., $$\begin{bmatrix}\alpha_{i_1} \\ \gamma_{i_1} \end{bmatrix}, \ldots, \begin{bmatrix}\alpha_{i_p} \\ \gamma_{i_p} \end{bmatrix}, \begin{bmatrix}0 \\ \beta_{j_1} \end{bmatrix}, \ldots , \begin{bmatrix}0 \\ \beta_{j_q} \end{bmatrix}$$ are linearly independent.

In summary, we showed that, for any set of linearly independent column vectors of $\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}$, the corresponding set of column vectors of $\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$ is also linearly independent. Since the rank of a matrix equals the maximal number of linearly independent columns, the proof is complete.
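The claim above can be checked by brute force on a small instance: every linearly independent set of columns of $\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}$ should stay independent when the same column indices are taken from $\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$. A NumPy sketch (the size $n = 3$ and the rank-$1$ construction are arbitrary choices that keep the search small while leaving plenty of dependent column sets):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n = 3
Z = np.zeros((n, n))
# Rank-1 A and B, arbitrary C.
A = rng.standard_normal((n, 1)) @ rng.standard_normal((1, n))
B = rng.standard_normal((n, 1)) @ rng.standard_normal((1, n))
C = rng.standard_normal((n, n))

M1 = np.block([[A, Z], [Z, B]])  # [[A, 0], [0, B]]
M2 = np.block([[A, Z], [C, B]])  # [[A, 0], [C, B]]
rk = np.linalg.matrix_rank

# Enumerate every subset of column indices.
for k in range(1, 2 * n + 1):
    for cols in combinations(range(2 * n), k):
        if rk(M1[:, list(cols)]) == k:          # independent in [[A, 0], [0, B]] ...
            assert rk(M2[:, list(cols)]) == k   # ... stays independent in [[A, 0], [C, B]]
```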

Another answer:

Let $$V = \operatorname{Span}\left(\left[\matrix{A\\O}\right]\right),$$ $$W = \operatorname{Span}\left(\left[\matrix{A\\C}\right]\right),$$ $$U = \operatorname{Span}\left(\left[\matrix{O\\B}\right]\right)$$ be the column spans. The desired result is $\dim(V+U) \le \dim(W+U)$. With $$\pi = \left[\matrix{I&O\\O&O}\right],$$ observe that $V = \pi(W)$, the image of $W$ under $\pi$. The result almost immediately falls out of this fact: since applying a linear map can only decrease the dimension of a subspace, we have $\dim V \le \dim W$, hence the "obviousness" of the result. The trouble is that we have to be careful about what happens when we add $U$ into the mix.

So $$\operatorname{rk}\left(\left[\matrix{A&O\\C&B}\right]\right) = \dim (W+U)$$ $$ = \dim W + \dim U - \dim(W \cap U).$$ Let $$U' = \operatorname{Span}\left(\left[\matrix{O\\I}\right]\right) \supseteq U,$$ so that $W \cap U' \supseteq W \cap U$, and $$\dim W + \dim U - \dim (W \cap U) \ge \dim W + \dim U - \dim (W \cap U').$$ If we show that $$\dim W - \dim(W \cap U') = \dim V,$$ then we are done, because $\dim(V+U) = \dim(V) + \dim(U) - \dim(V\cap U) = \dim V + \dim U$ as a consequence of $V \cap U = \{0\}$.
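The final step, $\dim(V+U) = \dim V + \dim U$, says in matrix terms that $\operatorname{rk}\left(\left[\matrix{A&O\\O&B}\right]\right) = \operatorname{rk}(A) + \operatorname{rk}(B)$. A quick NumPy check (the rank-$2$ and rank-$3$ constructions are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Z = np.zeros((n, n))
rk = np.linalg.matrix_rank

for _ in range(100):
    A = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))  # rank 2
    B = rng.standard_normal((n, 3)) @ rng.standard_normal((3, n))  # rank 3
    # V ∩ U = {0}, so dim(V + U) = dim V + dim U:
    assert rk(np.block([[A, Z], [Z, B]])) == rk(A) + rk(B)
```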

By the rank nullity theorem applied to $\pi|_W$, which has kernel $W \cap U'$, we have $$\dim W = \dim(W \cap U') +\dim V.$$
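Concretely, the kernel $W \cap U'$ consists of the vectors $\left[\matrix{Ax\\Cx}\right]$ with $Ax = 0$, so $\dim(W \cap U') = \dim C(\ker A)$, and the identity reads $\operatorname{rk}\left(\left[\matrix{A\\C}\right]\right) = \operatorname{rk}(A) + \operatorname{rk}(CN)$, where the columns of $N$ form a basis of $\ker A$. A NumPy sketch verifying this (the SVD-based `null_space` helper is written out here by hand; it mirrors `scipy.linalg.null_space`):

```python
import numpy as np

rng = np.random.default_rng(0)
rk = np.linalg.matrix_rank

def null_space(M, tol=1e-10):
    # Columns form an orthonormal basis of ker(M), computed via the SVD.
    _, s, vh = np.linalg.svd(M)
    r = int(np.sum(s > tol))
    return vh[r:].T

n = 5
for _ in range(100):
    # Rank-deficient A so the kernel is nontrivial; arbitrary C.
    A = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
    C = rng.standard_normal((n, n))
    N = null_space(A)       # columns of N span ker A
    W = np.vstack([A, C])   # columns of W span the subspace W above
    # dim W = dim(W ∩ U') + dim V, i.e. rk([A; C]) = rk(A) + rk(C N):
    assert rk(W) == rk(A) + rk(C @ N)
```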