Solve a system of equations where the coefficients contain variables


Given a set $S = \{(1, -1, 0), (2, 1, 4), (3, 0, 4)\}$, we are supposed to determine whether the set is linearly independent or linearly dependent.

The straightforward way to show that it's indeed linearly dependent is to observe that $(1, -1, 0) + (2, 1, 4) = (3, 0, 4)$.

But I was wondering what to do if there are more vectors and it's not so obvious that one vector can be expressed in terms of the others. I was thinking we could set up the following system:

$$ \alpha + 2\beta + 3\gamma = 0 \\ -\alpha + \beta = 0 \\ 4\beta + 4\gamma = 0 $$

Solving this I found that $\alpha = \beta$ and $\gamma = -\beta$. Plugging that into the first equation I obtained $\alpha + 2\alpha - 3\alpha = 0$, which is true - but it doesn't really help me show whether I can find $\alpha, \beta, \gamma$ that are not all zero, or whether they must all be zero, which is what I need to decide. I'm more used to solving systems where the constants on the RHS are nonzero.
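For larger sets, this hand computation amounts to row-reducing the coefficient matrix and reading off its null space. A minimal sketch of that check with SymPy (assuming it's installed; the matrix is the one from the system above, vectors as columns):

```python
from sympy import Matrix

# Columns are the vectors of S; the rows encode the three equations above.
A = Matrix([[1, 2, 3],
            [-1, 1, 0],
            [0, 4, 4]])

# nullspace() returns a basis of all (alpha, beta, gamma) with A x = 0.
# A nonempty null space means the vectors are linearly dependent.
null = A.nullspace()
print(null)  # one basis vector, proportional to (1, 1, -1)
```

Any nonzero multiple of the basis vector gives concrete coefficients $\alpha, \beta, \gamma$ witnessing the dependence.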

Somehow I got stuck here - how do you show that a set is linearly dependent/independent in more complex cases?

On BEST ANSWER

After having got that $\alpha=\beta$ and that $\gamma=-\beta$, you can take, say, $\beta=1$. Then $\alpha=1$ and $\gamma=-1$, and so you know from your computations that $$(1,-1,0)+(2,1,4)-(3,0,4)=(0,0,0).$$ So, your vectors are linearly dependent.
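This particular combination is easy to verify numerically; a quick sanity check with NumPy (assuming it's available):

```python
import numpy as np

v1 = np.array([1, -1, 0])
v2 = np.array([2, 1, 4])
v3 = np.array([3, 0, 4])

# alpha = 1, beta = 1, gamma = -1 from the solution above:
combo = 1 * v1 + 1 * v2 - 1 * v3
print(combo)  # [0 0 0]
```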

Another approach consists of computing the determinant $$\begin{vmatrix}1&2&3\\-1&1&0\\0&4&4\end{vmatrix}.$$ Since it is $0$, your vectors are linearly dependent. If it were nonzero, your vectors would be linearly independent.
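The determinant test (which applies when you have as many vectors as coordinates) can also be sketched in NumPy, assuming it's available; a rank computation gives the same information and also works for non-square cases:

```python
import numpy as np

# Vectors of S as the columns of the matrix from the answer.
M = np.array([[1, 2, 3],
              [-1, 1, 0],
              [0, 4, 4]], dtype=float)

# Determinant 0 (up to floating-point noise) means linear dependence.
det = np.linalg.det(M)

# Equivalently: rank below 3 means the three columns are dependent.
rank = np.linalg.matrix_rank(M)
print(rank)  # 2
```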