For $$A= \begin{bmatrix}B&0\\C&D\end{bmatrix}$$ where $B, C, D$ are matrices that may be rectangular, is it true or false that $$\text{rank}(A)=\text{rank}(B)+\text{rank}(D)$$
I think that if $C=0$ this is true, since the rank of $A$ is the number of linearly independent columns, which is the number of linearly independent columns of $B$ plus the number of linearly independent columns of $D$. But does $C$ affect this relationship? Or is it still true that $\text{rank}(A)=\text{rank}(B)+\text{rank}(D)$ when $C$ is nonzero?
The rank of the overall block triangular matrix is greater than or equal to the sum of the ranks of its diagonal blocks. That is, $$\text{rank}(A) \ge \text{rank}(B) + \text{rank}(D).$$
Equality is not necessarily attained, as exemplified by the matrix $$\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix},$$ which has rank $1$ even though its diagonal blocks $B = [0]$ and $D = [0]$ both have rank $0$.
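As a quick numerical sanity check (a sketch using NumPy; the matrices are just the counterexample above and its $C=0$ variant):

```python
import numpy as np

# Counterexample: B = [0] and D = [0] (both rank 0), with C = [1].
A = np.array([[0, 0],
              [1, 0]])
print(np.linalg.matrix_rank(A))   # 1, strictly greater than 0 + 0

# With C = 0, equality does hold for these blocks.
A0 = np.array([[0, 0],
               [0, 0]])
print(np.linalg.matrix_rank(A0))  # 0 = 0 + 0
```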
We can prove this inequality in two stages:

1. Prove the inequality for square $B$, $D$.
2. Reduce the rectangular case to the square case.
Square case
To begin, suppose that $B$ and $D$ are square, and consider their full SVDs: \begin{align*} B &= U \Sigma V^* \\ D &= W \Lambda Q^*, \end{align*} where $U,V,W,Q$ are square unitary (orthonormal) matrices containing singular vectors as columns, and $\Sigma, \Lambda$ are square diagonal matrices containing the singular values on the diagonal. Substituting these factorizations into the blocks of $A$ and factoring, we see that the block-triangular matrix $A$ is unitarily equivalent to a genuinely lower triangular matrix whose diagonal entries are the singular values: $$A = \begin{bmatrix} B \\ C & D \end{bmatrix}= \begin{bmatrix} U \Sigma V^* \\ C & W \Lambda Q^* \end{bmatrix}= \begin{bmatrix} U \\ & W \end{bmatrix} \begin{bmatrix} \Sigma \\ W^* C V & \Lambda \end{bmatrix} \begin{bmatrix} V^* \\ & Q^* \end{bmatrix}.$$ Now we can invoke the fact that the rank of a triangular matrix is greater than or equal to the number of its nonzero diagonal entries (the rows and columns through those entries contain an invertible triangular submatrix) to arrive at the desired conclusion: $$\text{rank}(A) = \text{rank}\left(\begin{bmatrix} \Sigma \\ W^* C V & \Lambda \end{bmatrix}\right) \ge \text{rank}(\Sigma) + \text{rank}(\Lambda) = \text{rank}(B) + \text{rank}(D).$$
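The factorization can be checked numerically (a NumPy sketch; the block sizes and ranks below are my own illustrative choices, not from the argument above):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Square blocks B and D, each of rank 2 (products of thin factors).
B = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
D = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
C = rng.standard_normal((n, n))
Z = np.zeros((n, n))
A = np.block([[B, Z], [C, D]])

# Full SVDs of the diagonal blocks: B = U Sigma V*, D = W Lambda Q*.
U, sig, Vh = np.linalg.svd(B)
W, lam, Qh = np.linalg.svd(D)

# The lower triangular matrix from the proof: [[Sigma, 0], [W* C V, Lambda]].
M = np.block([[np.diag(sig), Z],
              [W.conj().T @ C @ Vh.conj().T, np.diag(lam)]])

# Unitary equivalence: A = diag(U, W) @ M @ diag(V*, Q*).
recon = np.block([[U, Z], [Z, W]]) @ M @ np.block([[Vh, Z], [Z, Qh]])
print(np.allclose(recon, A))   # True: the factorization holds
print(np.linalg.matrix_rank(A),
      np.linalg.matrix_rank(B) + np.linalg.matrix_rank(D))
```

Since the outer factors are unitary, $\text{rank}(A) = \text{rank}(M)$, and the printed ranks exhibit the inequality.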
Reduction of rectangular case to square case
If $B$ and/or $D$ are not square, simply delete enough "redundant" columns and/or rows to make them square, while keeping their rank the same. This is always possible to do, since

- the rank of a matrix is less than or equal to the smallest dimension of the matrix, and
- a matrix of rank $r$ must have at least $r$ linearly independent columns/rows that we can keep.
The process of row and column deletion can be characterized by left- and right-multiplying by "selection" matrices, created by taking an identity matrix and deleting rows or columns from it (see this question). Call the selection matrices associated with deleting rows and columns of $A$ by the names $S_R$ and $S_C$, respectively. We have $$S_R A S_C = \begin{bmatrix} \tilde{B} \\ \tilde{C} & \tilde{D} \end{bmatrix},$$ where $\tilde{B}, \tilde{D}$ have the same ranks as $B, D$, respectively, but are square. Using the fact that the rank of a product of matrices is less than or equal to the rank of each individual factor, and applying the result shown above for square matrices, we get \begin{align*} \text{rank}(A) &\ge \text{rank}(S_R A S_C) \\ &= \text{rank}\left(\begin{bmatrix} \tilde{B} \\ \tilde{C} & \tilde{D} \end{bmatrix}\right) \\ &\ge \text{rank}(\tilde{B}) + \text{rank}(\tilde{D}) \\ &= \text{rank}(B) + \text{rank}(D), \end{align*} which is the desired inequality.
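To make the reduction concrete, here is a small NumPy sketch; the particular blocks and the deleted row/column indices are my own illustrative choices:

```python
import numpy as np

# Rectangular diagonal blocks: B is 2x3 (rank 2), D is 3x2 (rank 1).
B = np.array([[1., 0., 0.],
              [0., 1., 0.]])
C = np.ones((3, 3))
D = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])
A = np.block([[B, np.zeros((2, 2))],
              [C, D]])                      # 5 x 5

# Selection matrices: delete a row of the D block (A's last row)
# and a column of the B block (A's third column), chosen so the
# ranks of the diagonal blocks are preserved.
S_R = np.delete(np.eye(5), 4, axis=0)       # 4 x 5
S_C = np.delete(np.eye(5), 2, axis=1)       # 5 x 4
A_sq = S_R @ A @ S_C                        # all blocks are now 2 x 2

print(np.linalg.matrix_rank(A))             # 4 >= 2 + 1
print(np.linalg.matrix_rank(A_sq))          # 3 >= 2 + 1
```

Here the inequality chain reads $\text{rank}(A) = 4 \ge \text{rank}(S_R A S_C) = 3 \ge \text{rank}(\tilde{B}) + \text{rank}(\tilde{D}) = 3$.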