If $A$ and $B$ are matrices, $0$ is a zero matrix, and \begin{equation*} X= \begin{pmatrix} A& 0 \newline 0& B \end{pmatrix}, \end{equation*} prove that $\mathrm{rank}(X)=\mathrm{rank}(A)+\mathrm{rank}(B)$.
Also, if the upper right zero matrix were replaced with a matrix $C$, that is, \begin{equation*} X= \begin{pmatrix} A& C \newline 0& B \end{pmatrix}, \end{equation*} would it still be true that $\mathrm{rank}(X)=\mathrm{rank}(A)+\mathrm{rank}(B)$?
Prove that the rank of a block diagonal matrix equals the sum of the ranks of the matrices that are the main diagonal blocks.
10.4k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 best solutions below.
By the rank normal form (equivalence of matrices), we can find two invertible matrices $P$ and $Q$ such that: $$J_r=Q^{-1}AP,$$ where $J_r=\mathrm{diag}(1,\ldots,1,0,\ldots,0)$ with $r=\mathrm{rank}(A)$ ($r$ is the number of $1$'s).
Similarly for $B$, there are invertible $P'$ and $Q'$ such that: $$J_{r'}=Q'^{-1}BP',$$ where $r'=\mathrm{rank}(B)$.
Now with the invertible block matrices $S=\mathrm{diag}(Q,Q')$ and $T=\mathrm{diag}(P,P')$ we have: $$J=S^{-1}XT,$$ where $J=\mathrm{diag}(J_r,J_{r'})$. Since multiplying by invertible matrices preserves rank, and $J$ has exactly $r+r'$ nonzero rows, it is clear that: $$\mathrm{rank}(X)=\mathrm{rank}(J)=\mathrm{rank}(J_r)+\mathrm{rank}(J_{r'})=\mathrm{rank}(A)+\mathrm{rank}(B).$$
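As a quick numerical sanity check of this equality (using `numpy`; the particular block shapes and entries below are arbitrary assumptions for illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative blocks; A is made rank-deficient on purpose.
A = rng.standard_normal((4, 3))
A[:, 2] = A[:, 0] + A[:, 1]      # forces rank(A) = 2
B = rng.standard_normal((2, 5))  # generically rank(B) = 2

# Assemble X = [[A, 0], [0, B]] as a block-diagonal matrix.
X = np.block([
    [A,                np.zeros((4, 5))],
    [np.zeros((2, 3)), B               ],
])

rank = np.linalg.matrix_rank
assert rank(X) == rank(A) + rank(B)  # 4 == 2 + 2
```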
Suppose $A \in \mathbb{C}^{n_A \times m_A}$, $B \in \mathbb{C}^{n_B \times m_B}$. Let $\pi_A: \mathbb{C}^{n_A} \times \mathbb{C}^{n_B} \to \mathbb{C}^{n_A}$, and $\pi_B: \mathbb{C}^{n_A} \times \mathbb{C}^{n_B} \to \mathbb{C}^{n_B}$, be the projections onto the relevant spaces. Note that $\ker \pi_A \cap \ker \pi_B = \{ 0 \}$.
Now suppose columns $i_1,\ldots,i_h$ of $A$ and columns $j_1,\ldots,j_k$ of $B$ are linearly independent. Then columns $i_1,\ldots,i_h, j_1+m_A,\ldots,j_k+m_A$ of $X$ are linearly independent. To see this, suppose we have constants $\alpha_1,\ldots,\alpha_h$, $\beta_1,\ldots,\beta_k$ such that $$v = \sum_{l=1}^h \alpha_l X e_{i_l} + \sum_{l=1}^k \beta_l X e_{j_l+m_A} = 0.$$ Note that if $l \in \{1,\ldots,h\}$, then $\pi_A(X e_{i_l}) = A e_{i_l}$ (the two basis vectors live in $\mathbb{C}^{m_A+m_B}$ and $\mathbb{C}^{m_A}$ respectively), and if $l \in \{1,\ldots,k\}$, then $\pi_B(X e_{j_l+m_A}) = B e_{j_l}$, while $\pi_A(X e_{j_l+m_A}) = 0$.
Then we have $\pi_A(v) = \sum_{l=1}^h \alpha_l \pi_A(X e_{i_l}) = \sum_{l=1}^h \alpha_l A e_{i_l} = 0$, and hence $\alpha_l = 0$ for all $l$, since the chosen columns of $A$ are independent. Similarly, $\pi_B(v) = \sum_{l=1}^k \beta_l \pi_B(X e_{j_l+m_A}) = \sum_{l=1}^k \beta_l B e_{j_l} = 0$, and hence $\beta_l = 0$ for all $l$.
It follows that columns $i_1,...,i_h, j_1+m_A,...,j_k+m_A$ of $X$ are linearly independent, and hence that $\operatorname{rk} X \ge \operatorname{rk} A + \operatorname{rk} B$.
To see the reverse inequality, suppose columns $i_1,\ldots,i_p$ of $X$ are linearly independent. Let $I = \{i_1,\ldots,i_p\}$, and partition the indices into $K_A = \{i \in I \mid i \le m_A\}$ and $K_B = \{i \in I \mid i > m_A\}$; write $K_B - m_A = \{i - m_A : i \in K_B\}$ for the shifted index set. Then we claim that columns $K_A$ of $A$ are linearly independent, and columns $K_B - m_A$ of $B$ are linearly independent.
To see this, suppose $\sum_{l \in K_A} \alpha_l A e_l = 0$. As above, we have $\sum_{l \in K_A} \alpha_l \pi_A(X e_l) = \pi_A(\sum_{l \in K_A} \alpha_l X e_l) = 0$, and since $\ker \pi_A \cap \operatorname{sp}\{X e_l\}_{l=1}^{m_A} = \{0\}$, we have $\sum_{l \in K_A} \alpha_l X e_l = 0$, and hence $\alpha_l = 0$ for all $l$, because the columns of $X$ indexed by $K_A \subseteq I$ are linearly independent. It follows that columns $K_A$ of $A$ are linearly independent.
Now suppose $\sum_{l \in K_B - m_A} \beta_l B e_l = 0$. Again we have $\sum_{l \in K_B - m_A} \beta_l B e_l = \sum_{l \in K_B - m_A} \beta_l \pi_B(X e_{l+m_A}) = \pi_B(\sum_{l \in K_B - m_A} \beta_l X e_{l+m_A}) = 0$. Since $\pi_A(X e_{l+m_A}) = 0$ for $l \in \{1,\ldots,m_B\}$ and $\ker \pi_A \cap \ker \pi_B = \{0\}$, it follows that $\sum_{l \in K_B - m_A} \beta_l X e_{l+m_A} = 0$, and hence $\beta_l = 0$ for all $l$, because these are columns of $X$ indexed by $K_B \subseteq I$. It follows that columns $K_B - m_A$ of $B$ are linearly independent.
This gives $\operatorname{rk} X \le \operatorname{rk} A + \operatorname{rk} B$.
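The column-selection step above can be checked numerically (a sketch with `numpy`; the blocks, shapes, and chosen column sets are hypothetical examples, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical blocks: A is 3x4 with rank 2, B is 2x3 with (generic) rank 2.
A = rng.standard_normal((3, 4))
A[:, 2] = A[:, 0] + A[:, 1]
A[:, 3] = A[:, 0] - A[:, 1]
B = rng.standard_normal((2, 3))

m_A = A.shape[1]
X = np.block([
    [A,                np.zeros((3, 3))],
    [np.zeros((2, 4)), B               ],
])

# Independent columns of A and of B; B's indices are shifted by m_A in X.
cols_A = [0, 1]
cols_B = [0, 1]
cols_X = cols_A + [j + m_A for j in cols_B]

# The selected columns of X are jointly independent:
# the submatrix has full column rank.
assert np.linalg.matrix_rank(X[:, cols_X]) == len(cols_X)
```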
Suppose $A$ is $n \times n$ and $B$ is $m \times m$.
Looking at it row-wise: if you put together a maximal set of linearly independent rows among the first $n$ rows and another such set among the last $m$ rows, you get a set of linearly independent rows of $X$. This shows $\operatorname{rank}(X) \ge \operatorname{rank}(A) + \operatorname{rank}(B)$. Conversely, every row of $X$ is a linear combination of the rows in these two sets (each of the first $n$ rows is spanned by the chosen $A$-rows padded with zeros, and likewise for the $B$-rows), so the combined set spans the row space of $X$, which gives $\operatorname{rank}(X) \le \operatorname{rank}(A) + \operatorname{rank}(B)$.
If you now allow \begin{equation*} X= \begin{pmatrix} A& C \newline 0& B \end{pmatrix}, \end{equation*} just take $A = B = 0$ and $C \ne 0$ to see that the equality need not hold.
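That counterexample, written out concretely (a `numpy` sketch; the $2 \times 2$ sizes are an arbitrary choice):

```python
import numpy as np

# Counterexample for the block-triangular case: A = B = 0 but C != 0.
A = np.zeros((2, 2))
B = np.zeros((2, 2))
C = np.eye(2)

X = np.block([
    [A,                C],
    [np.zeros((2, 2)), B],
])

rank = np.linalg.matrix_rank
assert rank(A) + rank(B) == 0
assert rank(X) == 2  # rank(X) = rank(C), not rank(A) + rank(B)
```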