I was reading this answer to the problem of showing $\operatorname{rank}(\operatorname{adj}A)\in\{0,1,n\}$. In the first case we see:
If $A$ has rank less than or equal to $n - 2$ then any $n - 1$ columns of $A$ are linearly dependent and in particular, any $(n-1) \times (n-1)$ submatrix of $A$ has linearly dependent columns.
I can't understand why that is correct. Is there an elementary proof of this?
Is it because, to check that those $n-1$ vectors of size $n$ are linearly dependent, we solve a system of $n$ equations in $n-1$ variables and find a non-zero solution, and that same non-zero solution must also solve any subsystem of $n-1$ equations in $n-1$ variables taken from the first system?
If you have some relation between the vectors, i.e. some non-trivial linear combination equal to zero, the same combination will remain equal to zero when you remove a coordinate.
In other words, suppose that $v_1,v_2,\ldots,v_{n-1}$ are $n-1$ vectors in ${\mathbb R}^n$, and suppose that they are linearly dependent, i.e. we have $\lambda_1v_1+\ldots+\lambda_{n-1}v_{n-1}=0$ for some numbers $\lambda_1,\ldots,\lambda_{n-1}$ not all zero. If you denote by $p$ the linear projection ${\mathbb R}^n \to {\mathbb R}^{n-1}$ which removes the $i$-th coordinate, then you have
$$ 0=p(0)=p(\lambda_1v_1+\ldots+\lambda_{n-1}v_{n-1})= \lambda_1p(v_1)+\ldots+\lambda_{n-1}p(v_{n-1}) $$
so the vectors $p(v_1),p(v_2),\ldots,p(v_{n-1})$ are linearly dependent also.
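As a quick numeric sanity check (not part of the proof), here is a NumPy sketch with $n=4$; the vectors `v1`, `v2`, `v3` are hypothetical, with `v3` built so that the relation $1\cdot v_1 + 2\cdot v_2 - 1\cdot v_3 = 0$ holds:

```python
import numpy as np

# Three dependent vectors in R^4 (hypothetical example).
v1 = np.array([1.0, 2.0, 3.0, 4.0])
v2 = np.array([2.0, 0.0, 1.0, 1.0])
v3 = v1 + 2 * v2  # forced relation: 1*v1 + 2*v2 - 1*v3 = 0

V = np.column_stack([v1, v2, v3])   # 4x3 matrix with these columns
assert np.linalg.matrix_rank(V) < 3  # columns are linearly dependent

# Deleting any coordinate i (the projection p in the answer) gives a
# 3x3 submatrix whose columns satisfy the SAME relation, hence are
# still dependent, so every 3x3 minor vanishes.
for i in range(4):
    sub = np.delete(V, i, axis=0)   # p: drop the i-th coordinate (row)
    assert abs(np.linalg.det(sub)) < 1e-9
```

The loop checks all four choices of removed coordinate, matching the claim that every $(n-1)\times(n-1)$ submatrix built from these columns is singular.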