What does it mean for vectors of a matrix to be linearly dependent?


I'm studying matrices and the implications of the determinant being $0$. I've read that if the determinant of a transformation matrix is $0$, then the vectors in the rows or columns are "linearly dependent", and I keep searching trying to find another definition, because I have no idea what this means.

A) What does linearly dependent mean?

Also, when the coefficient matrix's determinant equals $0$, the system has a non-unique solution. I can sit down and calculate it on paper, so you might get something like $(t, 4, t-4)$, but B) what does it actually mean visually if you have a non-unique solution?
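To make the question concrete, here is one hypothetical $3\times 3$ system (chosen for illustration, not taken from the post) whose solution set is exactly the family $(t, 4, t-4)$ mentioned above; its third equation is the sum of the first two, so the coefficient matrix has determinant $0$:

```python
import numpy as np

# Hypothetical system chosen so its solutions are (t, 4, t - 4):
#   x     - z = 4
#       y     = 4
#   x + y - z = 8   (sum of the first two equations, hence redundant)
A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0,  0.0],
              [1.0, 1.0, -1.0]])
b = np.array([4.0, 4.0, 8.0])

print(np.linalg.det(A))  # 0: the rows are linearly dependent

# Every choice of the free parameter t gives a valid solution:
for t in (0.0, 2.0, -3.0):
    v = np.array([t, 4.0, t - 4.0])
    print(np.allclose(A @ v, b))  # True for each t
```

Geometrically, each equation is a plane in $\Bbb R^3$; because one plane is redundant, the remaining two intersect in a whole line of solutions rather than a single point.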

On BEST ANSWER

A set of vectors $v_1,\dots,v_n$ is linearly dependent if you can find scalars $c_1,\dots,c_n$, not all zero, such that $c_1v_1+\cdots+c_nv_n=0$.
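A minimal sketch of that definition using numpy (the specific vectors are made up for illustration): here $v_3 = v_1 + v_2$, so the scalars $(1, 1, -1)$, not all zero, combine the vectors to the zero vector, and the matrix built from them has determinant $0$.

```python
import numpy as np

# Three vectors in R^3 with v3 = v1 + v2, so the set is linearly dependent.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])

# The dependence relation: 1*v1 + 1*v2 + (-1)*v3 = 0.
combo = 1 * v1 + 1 * v2 + (-1) * v3
print(combo)  # [0. 0. 0.]

# Stacking the vectors as columns gives a matrix with zero determinant.
M = np.column_stack([v1, v2, v3])
print(np.linalg.det(M))  # 0.0
```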

You need to think of a square matrix $M$ as a function from $\Bbb R^n$ to $\Bbb R^n$: it takes $v\in\Bbb R^n$ to $M\cdot v$. The determinant is non-zero if and only if this function is 1-1 and onto (and therefore has an inverse function). The best intuition for this matrix material is to always think of matrices as functions.

Since the image of this function is a subspace of $\Bbb R^n$, if the matrix is not invertible then the function is not 1-1: it is many-to-one at every point of the image and hits no point outside the image. For example, if a map from $\Bbb R^3$ to $\Bbb R^3$ has its image inside the $xy$-plane ($\Bbb R^2$), it must collapse a dimension and therefore be many-to-one (infinitely-many-to-one, in fact). That's why, if the determinant is zero, infinitely many vectors map to the zero vector, and hence the associated system of equations has infinitely many solutions.
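The collapsing in the example above can be sketched directly (the matrix here is a made-up instance with its image in the $xy$-plane): the determinant is zero, and a whole line of vectors $t\,(1,1,-1)$ is sent to the zero vector.

```python
import numpy as np

# A 3x3 matrix whose image lies in the x-y plane (third row is zero):
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
print(np.linalg.det(M))  # 0: the map collapses a dimension

# Every multiple of n = (1, 1, -1) is mapped to the zero vector,
# so M v = 0 has infinitely many solutions.
n = np.array([1.0, 1.0, -1.0])
for t in (0.0, 1.0, -2.5):
    print(M @ (t * n))  # always [0. 0. 0.]
```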

Does that help?