Why is the following true?
Given a matrix $A \in \mathbb R^{n\times m}$ with rank $r \le \min(n,m)$, there are row and column index sets $$ \{ i_1, \ldots, i_r \}, \quad \{ j_1, \ldots, j_r \}, $$ such that the $r\times r$ submatrix $$ \hat A = (A_{i_s,j_t})_{s,t=1}^r $$ is nonsingular.
Of course, since the column rank of $A$ is $r$, we can find $r$ linearly independent columns of $A$. But how do we know that they stay linearly independent after restricting them to $r$ suitably chosen rows of $A$?
The point is that the rows are not chosen independently of the columns. First pick $r$ linearly independent columns $j_1,\ldots,j_r$ of $A$; they form an $n\times r$ matrix $B$ of rank $r$. Since the row rank of $B$ equals its column rank, $B$ has $r$ linearly independent rows $i_1,\ldots,i_r$. Restricting $B$ to these rows gives an $r\times r$ matrix whose rows are still linearly independent, so this submatrix $\hat A$ is nonsingular; equivalently, the corresponding $r\times r$ minor of $A$ is non-zero.
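As a numerical sanity check of this argument: pick $r$ independent columns of $A$, then $r$ independent rows of that $n\times r$ restriction $B$ (not of $A$ itself), and verify that the resulting $r\times r$ submatrix is nonsingular. A NumPy sketch, where the rank-2 matrix below is an arbitrary example and not taken from the question:

```python
import numpy as np

# An arbitrary rank-2 example: the third column is the sum of the
# first two, so rank(A) = 2 < min(3, 3).
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])
r = np.linalg.matrix_rank(A)

# Greedily collect r linearly independent columns of A.
cols = []
for j in range(A.shape[1]):
    if np.linalg.matrix_rank(A[:, cols + [j]]) > len(cols):
        cols.append(j)

B = A[:, cols]  # n x r matrix of rank r

# Greedily collect r linearly independent rows of B -- this is the
# crucial step: the rows are chosen relative to B, not to A.
rows = []
for i in range(B.shape[0]):
    if np.linalg.matrix_rank(B[rows + [i], :]) > len(rows):
        rows.append(i)

sub = A[np.ix_(rows, cols)]  # the r x r submatrix
print(rows, cols, np.linalg.det(sub))  # a nonzero determinant
```

For the matrix above this selects rows $\{0, 2\}$ and columns $\{0, 1\}$, and the determinant of the resulting $2\times 2$ submatrix is nonzero, as the argument predicts.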