How to test if $m$ vectors are linearly dependent when they are $n$-dimensional and $m < n$


I'll be shocked if this isn't a duplicate, but I haven't had a lot of luck finding an answer to this so far.

How do you test if a set of vectors $v_1, \ldots, v_m \in \mathbb{R}^n$ are linearly independent when $m < n$?

I know that when you have $m = n$ you can take the determinant of their matrix, and when $m > n$ the system is over-determined and they are always dependent. But what about for under-determined systems? Padding with $0$s will create an artificial dependence, so that won't work.

An alternate plan would be to find the subspace of $\mathbb{R}^n$ that the vectors span and test the dimensionality of the space, but I don't know how to actually do that.
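The span-dimension plan above amounts to a rank computation. As a numerical sketch (the vectors here are illustrative, not from the question), NumPy's `matrix_rank` gives the dimension of the column span directly:

```python
import numpy as np

# Illustrative vectors in R^4 (m = 3 < n = 4); v3 = v1 + v2 is an
# intentional dependence.
v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0])
v3 = v1 + v2

A = np.column_stack([v1, v2, v3])   # 4x3 matrix, vectors as columns
rank = np.linalg.matrix_rank(A)
m = A.shape[1]

# The columns are linearly independent iff rank == m.
print(rank, m, rank == m)           # 2 3 False -> dependent
```

Dropping `v3` and stacking only `v1, v2` gives rank $2 = m$, i.e. independence. Note that `matrix_rank` uses a floating-point SVD with a tolerance, so near-dependent vectors may be reported either way.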

Best answer:

Put them as columns in a matrix $A$. Then, they are linearly independent if and only if the null space of $A$ is $\{ 0\}$. Alternatively, they are linearly independent if and only if the reduced row echelon form of $A$ has $m$ leading ones, i.e. $\operatorname{rank}(A) = m$.
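Both checks from the answer can be done exactly (no floating-point tolerance) with SymPy; the vectors below are illustrative:

```python
from sympy import Matrix

# Illustrative vectors in R^3 (m = 2 < n = 3), placed as columns of A.
A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])
m = A.shape[1]

# Check 1: the RREF has m leading ones (pivot columns).
rref_form, pivots = A.rref()
print(len(pivots) == m)      # True -> independent

# Check 2: the null space of A is {0}, i.e. it has an empty basis.
print(A.nullspace() == [])   # True -> independent
```

Exact arithmetic makes this suitable for rational or symbolic entries, at the cost of speed on large matrices.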