This seems simple, but I don't really know how to prove it, so I'd be thankful for any help.
Let $A$, $B$, and $C$ be arbitrary $n\times n$ matrices, and let $0$ denote the $n\times n$ zero matrix. I want to show that
$$\operatorname{rk}\left(\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}\right)\leqslant \operatorname{rk}\left(\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}\right).$$
Let $A = \begin{bmatrix} \alpha_1 & \cdots & \alpha_n\end{bmatrix}$, $B = \begin{bmatrix} \beta_1 & \cdots & \beta_n\end{bmatrix}$, $C = \begin{bmatrix} \gamma_1 & \cdots & \gamma_n\end{bmatrix}$, where $\alpha_i, \beta_i, \gamma_i$ are column vectors of $A, B, C$ respectively.
Consider any set of linearly independent columns $\{\delta_{i_1}, \ldots, \delta_{i_p}, \xi_{j_1}, \ldots, \xi_{j_q}\}$ of the matrix $\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}$, where $1 \leq i_1 < \cdots < i_p \leq n$ and $1 \leq j_1 < \cdots < j_q \leq n$. Note that $p$ or $q$ may be $0$, in which case the corresponding $\delta$'s or $\xi$'s are absent from the set. Here, the $\delta_i$ are columns taken from $\begin{bmatrix} A \\ 0 \end{bmatrix}$, while the $\xi_j$ are columns taken from $\begin{bmatrix} 0 \\ B \end{bmatrix}$.
Since $\{\delta_{i_1}, \ldots, \delta_{i_p}, \xi_{j_1}, \ldots, \xi_{j_q}\}$ is linearly independent, so are the subsets $\{\delta_{i_1}, \ldots, \delta_{i_p}\}$ and $\{\xi_{j_1}, \ldots, \xi_{j_q}\}$; because $\delta_i = \begin{bmatrix}\alpha_i \\ 0\end{bmatrix}$ and $\xi_j = \begin{bmatrix}0 \\ \beta_j\end{bmatrix}$, this means, respectively, that \begin{align} & a_1\alpha_{i_1} + \cdots + a_p\alpha_{i_p} = 0 \implies a_1 = \cdots = a_p = 0 \tag{1} \\ & b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0 \implies b_1 = \cdots = b_q = 0. \tag{2} \end{align}
Now we can show that the corresponding columns taken from $\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$ are linearly independent. Indeed, if \begin{align} a_1\begin{bmatrix}\alpha_{i_1} \\ \gamma_{i_1} \end{bmatrix} + \cdots + a_p\begin{bmatrix}\alpha_{i_p} \\ \gamma_{i_p} \end{bmatrix} + b_1\begin{bmatrix}0 \\ \beta_{j_1} \end{bmatrix} + \cdots + b_q\begin{bmatrix}0 \\ \beta_{j_q} \end{bmatrix} = \begin{bmatrix}0 \\ 0 \end{bmatrix}, \end{align} then \begin{align} & a_1\alpha_{i_1} + \cdots + a_p\alpha_{i_p} = 0 \tag{3} \\ & a_1\gamma_{i_1} + \cdots + a_p\gamma_{i_p} + b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0 \tag{4} \end{align} Together, $(3)$ and $(1)$ imply $a_1 = \cdots = a_p = 0$. Substituting $a_1 = \cdots = a_p = 0$ into $(4)$ gives $b_1\beta_{j_1} + \cdots + b_q\beta_{j_q} = 0$, which implies $b_1 = \cdots = b_q = 0$ by $(2)$. This shows $a_1 = \cdots = a_p = b_1 = \cdots = b_q = 0$, i.e., $$\begin{bmatrix}\alpha_{i_1} \\ \gamma_{i_1} \end{bmatrix}, \ldots, \begin{bmatrix}\alpha_{i_p} \\ \gamma_{i_p} \end{bmatrix}, \begin{bmatrix}0 \\ \beta_{j_1} \end{bmatrix}, \ldots , \begin{bmatrix}0 \\ \beta_{j_q} \end{bmatrix}$$ are linearly independent.
In summary, we have shown that every set of linearly independent columns of $\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}$ corresponds (in the same column positions) to a set of linearly independent columns of $\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$. Since the rank of a matrix equals the maximum number of linearly independent columns, this gives $\operatorname{rk}\begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix} \leqslant \operatorname{rk}\begin{bmatrix} A & 0 \\ C & B \end{bmatrix}$, completing the proof.
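Not part of the proof, but if you want a quick numerical sanity check, here is a small pure-Python sketch. The `rank` and `block` helpers are ad-hoc (not from any library); `rank` uses exact `Fraction` arithmetic so Gaussian elimination has no rounding issues. It checks the inequality on random integer matrices and also exhibits a case where the inequality is strict ($A = B = 0$, $C = I$):

```python
from fractions import Fraction
import random

def rank(M):
    """Rank of a matrix (list of rows) via exact Gaussian elimination."""
    A = [[Fraction(x) for x in row] for row in M]
    rows = len(A)
    cols = len(A[0]) if rows else 0
    r = 0  # next pivot row
    for c in range(cols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            for j in range(c, cols):
                A[i][j] -= f * A[r][j]
        r += 1
    return r

def block(A, B, C):
    """Assemble the 2n x 2n block matrix [[A, 0], [C, B]]."""
    n = len(A)
    top = [A[i] + [0] * n for i in range(n)]
    bottom = [C[i] + B[i] for i in range(n)]
    return top + bottom

n = 3
random.seed(0)
rand = lambda: [[random.randint(-2, 2) for _ in range(n)] for _ in range(n)]
zero = [[0] * n for _ in range(n)]

# Random instances: rk([A 0; 0 B]) <= rk([A 0; C B]) always holds.
for _ in range(100):
    A, B, C = rand(), rand(), rand()
    assert rank(block(A, B, zero)) <= rank(block(A, B, C))

# The inequality can be strict: with A = B = 0 and C = I,
# rk([0 0; 0 0]) = 0 while rk([0 0; I 0]) = n.
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
print(rank(block(zero, zero, zero)), rank(block(zero, zero, I)))  # 0 3
```

The strict case also shows why only the inequality (and not equality) can hold in general: the extra block $C$ can only add to the column space of the lower half, never remove from it.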