A bit of confusion about linearly independent vectors


Suppose I have the vectors

$$v = (1, 0, 1) \qquad u = (2, -1, 0) \qquad w = (0, 0, 1) \qquad s = (2, 1, 1)$$

Now these are vectors in $\mathbb{R}^3$. If I want to study their linear independence, I can write down the associated matrix, that is

$$M= \begin{pmatrix} 1 & 0 & 1 \\ 2 & -1 & 0 \\ 0 & 0 & 1 \\ 2 & 1 & 1 \end{pmatrix} $$

Through Gaussian elimination or the criterion of minors, I find that the rank is three.
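(As a sanity check, the rank can also be computed numerically; a quick sketch with NumPy, with the entries taken from the matrix $M$ above:)

```python
import numpy as np

# The four vectors written as rows, exactly the matrix M above.
M = np.array([
    [1,  0, 1],
    [2, -1, 0],
    [0,  0, 1],
    [2,  1, 1],
])

# matrix_rank uses the SVD; it agrees with Gaussian elimination here.
print(np.linalg.matrix_rank(M))  # 3
```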

What confuses me now is that the notes say "vectors are linearly independent if the rank of the associated matrix is equal to the number of the unknown".

But again then it says "four or more vectors in $\mathbb{R}^3$ are linearly dependent."

So now I am confused: the rank of the matrix is three, which equals the number of unknowns, yet I have four vectors. Can somebody please explain this clearly?

Also, another question: suppose I build the associated matrix where the vectors are the columns and not the rows. Here the rank of the matrix is three, the maximum possible. But if the rank is maximal, doesn't that mean they are linearly independent?

What is the difference of creating the row matrix and the column matrix in this sense?

Thank you!

3 Answers

On BEST ANSWER

Doesn't your book tell you what it calls "the associated matrix"?

Its sentence "vectors are linearly independent if the rank of the associated matrix is equal to the number of the unknown" is correct with the usual notion of a matrix associated to $n$ vectors: it has $n$ columns (in your example, each of your 4 vectors should be written vertically, and the associated matrix is the transpose of your $M$).

I don't follow your "if the rank is maximum then they are linearly independent". The rank of the matrix $M^T$ (whose 4 columns are your vectors) is equal to the rank of this family of 4 vectors, i.e. the dimension of the subspace they span. Since this rank of $M^T$ (equal to the rank of $M$) is strictly less than the number of vectors, they are dependent.
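To make the dependence concrete, one can solve for the coefficients that express the fourth vector in terms of the other three; a sketch with NumPy (the variable names `v, u, w, s` are mine, matching the question):

```python
import numpy as np

v = np.array([1, 0, 1])
u = np.array([2, -1, 0])
w = np.array([0, 0, 1])
s = np.array([2, 1, 1])

# v, u, w are independent (rank 3), so s must be a combination of them.
# Solve the 3x3 system [v u w] x = s for the coefficients.
B = np.column_stack([v, u, w])
x = np.linalg.solve(B, s)
print(x)  # [ 4. -1. -3.]
```

So $s = 4v - u - 3w$, an explicit dependence relation among the four vectors.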


About the first question, I believe the notes are half wrong: the part that says "four or more vectors in $\mathbb{R}^3$ are linearly dependent" is correct.

This is indeed a consequence of the Steinitz exchange lemma: for a vector space $V$, if $g$ is a linearly independent set and $s$ is a spanning set, then $|g| \leq |s|$. Since we may take $s$ to be a basis of $V$, this implies that we must have $|g| \leq \dim(V)$.

The questionable part is "vectors are linearly independent if the rank of the associated matrix is equal to the number of unknowns". As stated it is confusing, and in your example it fails: the number of unknowns is three and the rank is three, but by the argument above the four vectors cannot be independent.

Also imagine it this way: take the canonical basis of $\mathbb{R}^3$, i.e. the vectors $e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$, $e_3 = (0, 0, 1)$.

It is rather obvious that any other three-dimensional vector $t = (a, b, c)$ will be a linear combination of the canonical basis. If this were not the case, then $\mathbb{R}^3$ would have dimension greater than three, which is of course absurd.


"Given vectors are linearly independent if the rank of the associated matrix is equal to the number of these given vectors" is the correct statement. The number of unknowns is a notion attached to a system of linear equations; perhaps "the number of unknowns" in your notes was meant to be the number of the given vectors.

"Four or more vectors given in $\Bbb{R}^3$ are linearly dependent": true. The associated matrix is then $n\times 3$ with $n\geq 4$, so its rank is at most $3$. Since $n \geq 4 > 3$, the number of given vectors is strictly greater than the rank, not equal to it, so they cannot be independent.

Do not build the matrix with the vectors as columns unless you are asked to or have a special purpose: after row operations, the rows of that matrix no longer span the subspace generated by your vectors, although the dimension you compute for it is still correct. But you are right that the rank of this matrix is always equal to the rank of the row matrix (rank is invariant under transposition), so it can be used to test linear independence too.
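The transposition point can be checked directly; a small sketch, assuming NumPy:

```python
import numpy as np

# Vectors as rows (M) versus vectors as columns (M.T).
M = np.array([
    [1,  0, 1],
    [2, -1, 0],
    [0,  0, 1],
    [2,  1, 1],
])

# Rank is invariant under transposition, so either orientation gives
# the same answer when testing linear independence.
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(M.T))  # 3 3
```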