Determine whether the set of vectors is linearly independent or linearly dependent


$$u=(1,1,1,3)$$ $$v=(1,2,1,3)$$ $$w=(1,2,3,2)$$

I need help understanding the method for solving this type of problem. I understand that the concept is to find out whether the constants $k_n$ in $k_1u+k_2v+k_3w=0$ must all equal zero or not.

Linear independence: $k_1=k_2=k_3=0$ is the only choice of constants that satisfies the equation.

Linear dependence: there is a solution in which at least one of the $k$ values is nonzero.

This is how I tried solving this:

  1. Construct an augmented matrix $A$ out of the vectors.

$$A = \left[\begin{array}{ccc|c}1 & 1 & 1 & 0\\1 & 2 & 2 & 0\\ 1 & 1 & 3 & 0 \\ 3 & 3 & 2 & 0\end{array}\right]$$

  2. Use row operations as much as possible to get to row-echelon form.

$$A = \left[\begin{array}{ccc|c}1 & 1 & 1 & 0\\0 & 1 & 1 & 0\\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0\end{array}\right]$$

  3. We now have the following:

$$a_1+a_2+a_3=0$$ $$a_2+a_3=0$$ $$a_3=0$$

This is where I'm confused. I don't know what these $a$ values are supposed to represent. They can't be the vector entries, because the row operations changed those. Are they the $k$ values I'm looking for?

I would conclude that this system of equations is linearly independent but that's assuming I even understand what the $a_n$ values are...
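As a sanity check on the row reduction above, the rank of the coefficient matrix can be computed numerically. This is a sketch using NumPy (not part of the original question): if the rank equals the number of vectors, the only solution is the zero vector.

```python
import numpy as np

# Columns are the vectors u, v, w from the question.
A = np.array([
    [1, 1, 1],
    [1, 2, 2],
    [1, 1, 3],
    [3, 3, 2],
])

# If the rank equals the number of columns (3), the only solution to
# A @ k = 0 is k = 0, i.e. the vectors are linearly independent.
rank = np.linalg.matrix_rank(A)
print(rank)  # 3 -> linearly independent
```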

There are 2 answers below.

---

First, note that augmenting with $0$'s is superfluous, as row operations won't change a zero-column.

It would be useful to find the reduced row-echelon form of $A$. In this case $$ \DeclareMathOperator{\rref}{rref}\rref A= \left[\begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{array}\right] $$ This tells us that any scalars $a_1,a_2,a_3$ satisfying $$ a_1 u+a_2 v+a_3 w=\vec 0 $$ must also satisfy $$ a_1=a_2=a_3=0 $$ This is exactly the statement that $u,v,w$ are linearly independent!
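The reduced row-echelon form can be reproduced symbolically; here is a sketch using SymPy (an illustration, not part of the original answer). `Matrix.rref()` returns the RREF along with the pivot column indices:

```python
import sympy as sp

# Columns are the vectors u, v, w.
A = sp.Matrix([
    [1, 1, 1],
    [1, 2, 2],
    [1, 1, 3],
    [3, 3, 2],
])

R, pivots = A.rref()  # reduced row-echelon form and pivot columns
print(R)       # identity block on top, zero row at the bottom
print(pivots)  # (0, 1, 2): every column has a pivot
```

Every column being a pivot column is exactly the condition for linear independence.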

---

To see where those $a$'s come from and what they represent: when you construct an augmented matrix as you did, what you really have is

$$[A|0] = \left[\begin{array}{ccc|c} 1 & 1 & 1 & 0 \\ 1 & 2 & 2 & 0 \\ 1 & 1 & 3 & 0 \\ 3 & 3 & 2 & 0 \\ \end{array} \right].$$

which is equivalent to the matrix equation $$A\vec{a} = \vec{0},$$

or,

$$\left[\begin{array}{ccc} 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 1 & 3 \\ 3 & 3 & 2 \\ \end{array} \right] \left[\begin{array}{c} a_{1} \\ a_{2} \\ a_{3} \end{array} \right] = \left[\begin{array}{c} 0 \\ 0 \\ 0 \end{array} \right], $$

which describes the system,

$$\begin{align} 1\cdot a_{1} + 1\cdot a_{2} + 1\cdot a_{3} &= 0\\ 1\cdot a_{1} + 2\cdot a_{2} + 2\cdot a_{3} &= 0\\ 1\cdot a_{1} + 1\cdot a_{2} + 3\cdot a_{3} &= 0\\ 3\cdot a_{1} + 3\cdot a_{2} + 2\cdot a_{3} &= 0\\ \end{align}$$

After row reduction, this becomes $$\begin{align} 1\cdot a_{1} + 1\cdot a_{2} + 1\cdot a_{3} &= 0\\ 0\cdot a_{1} + 1\cdot a_{2} + 1\cdot a_{3} &= 0\\ 0\cdot a_{1} + 0\cdot a_{2} + 1\cdot a_{3} &= 0\\ 0\cdot a_{1} + 0\cdot a_{2} + 0\cdot a_{3} &= 0,\\ \end{align}$$

which simplifies to

$$\begin{align} 1\cdot a_{1} + 1\cdot a_{2} + 1\cdot a_{3} &= 0\\ 1\cdot a_{2} + 1\cdot a_{3} &= 0\\ 1\cdot a_{3} &= 0.\\ \end{align}$$

As Brian mentioned, continuing to reduced row-echelon form gives $$\begin{align} 1 \cdot a_{1} &= 0\\ 1 \cdot a_{2} &= 0\\ 1 \cdot a_{3} &= 0. \end{align}$$

Thus the only solution to your original equation is the vector, $$ \left[ \begin{array}{c} a_{1} \\ a_{2} \\ a_{3} \end{array} \right] = \left[ \begin{array}{c} 0 \\ 0 \\ 0 \end{array} \right]$$

which is exactly the definition of linear independence.
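Equivalently, the full solution set of $A\vec{a}=\vec{0}$ is the null space of $A$. A quick sketch with SymPy (an illustration, not part of the original answer): an empty null space means the zero vector is the only solution, so the vectors are linearly independent.

```python
import sympy as sp

# Columns are the vectors u, v, w.
A = sp.Matrix([
    [1, 1, 1],
    [1, 2, 2],
    [1, 1, 3],
    [3, 3, 2],
])

# nullspace() returns a basis for the solutions of A*a = 0;
# an empty basis means only the zero solution exists.
print(A.nullspace())  # [] -> u, v, w are linearly independent
```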