For a square matrix, row vectors are linearly independent if and only if columns are.


Is there a way of proving this from the ground up? Namely, using brute-force computation, entry by entry?



BEST ANSWER

Maybe this is what you mean by "from the ground up".

If the rows of $A$ are linearly independent, then row-reducing $A$ gives the identity matrix, so the only solution of $Av=0$ is $v=0$.
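The row-reduction step can be checked by brute force, entry by entry, as the question asks. Below is a minimal sketch with a hand-rolled Gauss-Jordan elimination and a hypothetical $3\times3$ matrix whose rows are independent (its determinant is $5\ne0$), so its reduced row echelon form is the identity.

```python
import numpy as np

def rref(M, tol=1e-12):
    """Gauss-Jordan elimination with partial pivoting, entry by entry."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    r = 0
    for c in range(cols):
        if r == rows:
            break
        p = r + np.argmax(np.abs(M[r:, c]))  # pick the largest pivot in column c
        if abs(M[p, c]) < tol:               # no usable pivot in this column
            continue
        M[[r, p]] = M[[p, r]]                # swap pivot row into place
        M[r] /= M[r, c]                      # scale pivot to 1
        for i in range(rows):
            if i != r:
                M[i] -= M[i, c] * M[r]       # eliminate column c elsewhere
        r += 1
    return M

# Hypothetical example with independent rows: rref gives the identity.
A = np.array([[2., 1., 0.],
              [0., 1., 1.],
              [1., 0., 2.]])
print(np.allclose(rref(A), np.eye(3)))   # True
```

Since the reduced form is $I$, back-substitution forces $v=0$ as the only solution of $Av=0$.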

If the columns of $A$ are linearly dependent, say, $$a_1c_1+a_2c_2+\cdots+a_nc_n=0$$ where the $c_i$ are the columns and the $a_i$ are not all zero, then $Av=0$ where $$v=(a_1,a_2,\dots,a_n)\ne0$$

So, if the columns are dependent, then so are the rows.

Now apply the same argument to $A^T$ to conclude that if the rows of $A$ are dependent then so are the columns.
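The whole argument can be illustrated numerically. Below is a small sketch with a hypothetical $3\times3$ matrix whose third column is the sum of the first two; the dependence gives a nonzero $v$ with $Av=0$, and the rank of $A$ equals the rank of $A^T$, so the rows are dependent as well.

```python
import numpy as np

# Hypothetical example: third column = first column + second column.
A = np.array([[1., 2., 3.],
              [4., 5., 9.],
              [7., 8., 15.]])

# The dependence c1 + c2 - c3 = 0 gives a nonzero solution of A v = 0.
v = np.array([1., 1., -1.])
print(np.allclose(A @ v, 0))   # columns are dependent

# Rank of A equals rank of A^T, so the rows are dependent too (rank < 3).
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))   # 2 2
```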

ANOTHER ANSWER

Let $K$ be a field. Consider an $n\times m$ matrix as a $K$-linear mapping $\varphi\colon K^m\to K^n$. The column rank is the dimension $\dim_K(\operatorname{Im}\varphi)$, while the row rank is the dimension $\dim_K(\operatorname{Im}\varphi^\ast)$ of the image of the transpose $\varphi^\ast$ of $\varphi$. Recall that we have a $K$-linear isomorphism $$(\operatorname{Im}\varphi)^\ast\xrightarrow{\ \sim\ }\operatorname{Im}\varphi^\ast,\qquad \lambda\mapsto\lambda\circ\varphi,$$ which implies $$\dim(\operatorname{Im}\varphi)=\dim\bigl((\operatorname{Im}\varphi)^\ast\bigr)=\dim(\operatorname{Im}\varphi^\ast),$$ thus proving our assertion.
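The conclusion $\dim(\operatorname{Im}\varphi)=\dim(\operatorname{Im}\varphi^\ast)$ is just "column rank equals row rank", which holds for rectangular matrices too. A minimal numerical check, assuming a randomly generated $4\times6$ integer matrix:

```python
import numpy as np

# Random rectangular matrix over the rationals (stored as floats).
rng = np.random.default_rng(0)
B = rng.integers(-3, 4, size=(4, 6)).astype(float)

col_rank = np.linalg.matrix_rank(B)    # dim Im(phi), the column space
row_rank = np.linalg.matrix_rank(B.T)  # dim Im(phi^*), the row space
print(col_rank == row_rank)            # True
```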