Why is it the case that when the columns of a square matrix are dependent, the rows of the matrix are dependent also?


The question is self-explanatory. But as an example (this is from Strang's lectures), take the following Markov matrix M.

$M=\begin{bmatrix} .1 & .01 & .3 \\ .2 & .99 & .3 \\ .7 & 0 & .4 \end{bmatrix}$

Here, the columns of $M - I$ are clearly dependent: each column of $M$ adds up to $1$, so each column of $M - I$ adds up to $0$, which means all three columns lie in the plane $x_1 + x_2 + x_3 = 0$. However, in the lectures Strang mentions that this implies the rows of $M - I$ are also linearly dependent. Could someone explain why that is the case in general?
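As a quick numerical sanity check (a sketch using NumPy, not part of Strang's lecture), one can verify that $M - I$ has rank $2$, so both its columns and its rows are dependent, and that the rank computed from the rows equals the rank computed from the columns:

```python
import numpy as np

M = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.0,  0.4]])
A = M - np.eye(3)

# Each column of M sums to 1, so each column of A = M - I sums to 0.
# Equivalently, the three rows of A sum to the zero row -- an explicit
# linear dependency among the rows.
print(A.sum(axis=0))                   # approximately [0, 0, 0]

# Row rank equals column rank: both are 2, one less than full rank 3.
print(np.linalg.matrix_rank(A))        # rank via the matrix itself
print(np.linalg.matrix_rank(A.T))      # same rank for the transpose
```

Note that the dependency $(1,1,1)$ works on the rows here, while the columns are dependent simply because three vectors confined to a two-dimensional plane cannot be independent; the question is why these two kinds of dependence must always occur together.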