Proofs that the zero vector is not linearly independent


Re-checking my linear algebra, using Lax's textbook Linear Algebra and Its Applications.

I want to know if my attempt at proving the following theorem is OK (it has been a while since I attended this lecture).

Show that if the vectors $x_1, x_2,\dots,x_j$ are linearly independent, then none of them is the zero vector.

As far as I understand, this is a proof by contradiction. I have to exhibit a nontrivial linear combination that fulfills

$$ k_1x_1+k_2x_2+\dots+k_jx_j=0, $$

where the $k_i\in K$ are elements of the field $K$.

Without loss of generality, suppose that $x_1$ is the zero vector. Then

$$ k_1x_1+\dots+k_jx_j=0, $$

where all the $k_i$ must be zero. We can add the term $\tilde{k}x_1$ (where $\tilde{k}\in K$) to both sides, and by Eq. (10), $0x_1=0$. Therefore

$$ (\tilde{k}+k_1)x_1+\dots+k_jx_j=\tilde{k}x_1+\dots+k_jx_j=0, $$

and I have shown that there is a non-trivial linear combination involving the zero vector, which is a contradiction.

There are 3 answers below.


Unfortunately, as it stands, this is not a good proof.

  • You don't ever introduce your assumption. You are assuming that the set is linearly independent, and that is why you can conclude something about the $k_i$.

  • You don't say what $\tilde{k}$ is; you need $\tilde{k}$ to be non-zero, or else you've concluded nothing.

  • The proof is pretty circuitous and comes to the correct conclusion in a very roundabout way.

The key idea is just your last line: exhibit a combination that has a non-zero coefficient. Taking $k_1 = 1$ and $k_i = 0$ for $i > 1$ is good enough. And this gives a proof by contrapositive, which is quite a bit cleaner than the contradiction proof you were going for.
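The choice $k_1 = 1$, $k_i = 0$ for $i > 1$ can be checked numerically. A minimal sketch with numpy, using hypothetical vectors in $\mathbb{R}^3$ (the vectors and dimension are illustrative, not from the question):

```python
import numpy as np

# Illustrative vectors in R^3; x1 is the zero vector.
x1 = np.zeros(3)
x2 = np.array([1.0, 0.0, 2.0])
x3 = np.array([0.0, 1.0, 1.0])

# The non-trivial combination from the answer: k1 = 1, all other k_i = 0.
k = np.array([1.0, 0.0, 0.0])
combo = k[0] * x1 + k[1] * x2 + k[2] * x3
print(np.allclose(combo, 0))  # True: a non-zero coefficient vector gives 0

# Equivalently, the matrix with these columns cannot have full column rank,
# so the set {x1, x2, x3} is linearly dependent.
print(np.linalg.matrix_rank(np.column_stack([x1, x2, x3])))  # 2, not 3
```

The rank check is the general-purpose test: any set containing the zero vector produces a column of zeros, which forces the rank below the number of vectors.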


Let $V$ be a vector space over $K$. I guess it suffices to prove that if $v$ is the zero vector, then $\lambda v = 0_{V}$ for all $\lambda \in K$. Then, for $v_1, \dots, v_n \in V$, if $v_i$ is the zero vector: $$\lambda_1 v_1 + \dots + \lambda_i v_i + \dots + \lambda_n v_n = 0_V $$

is a non-trivial combination for $\lambda_1 = \dots = \lambda_{i-1} = \lambda_{i+1} = \dots = \lambda_n = 0_K$ and any non-zero $\lambda_i \in K$.

To see that $\lambda 0_V = 0_V$ for all $\lambda$, note that $\lambda 0_V = \lambda (0_V + 0_V) = \lambda 0_V + \lambda 0_V$. Adding the additive inverse of $\lambda 0_V$ - which exists by the vector space axioms - to both sides of the equation, we get $\lambda 0_V = 0_V$.
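The identity $\lambda 0_V = 0_V$ can also be spot-checked numerically. A small sketch, assuming $V = \mathbb{R}^4$ (the dimension and sample scalars are arbitrary choices, not from the answer):

```python
import numpy as np

zero = np.zeros(4)  # 0_V in R^4; the dimension is chosen arbitrarily
for lam in (0.0, 1.0, -3.5, 2e6):
    # lambda * 0_V should equal 0_V for every scalar lambda
    assert np.allclose(lam * zero, zero)
print("lambda * 0_V = 0_V for all sampled lambda")
```

Of course this only samples a few scalars; the algebraic argument above is what covers all of $K$.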


As far as I understand, this is a proof by contradiction. I have to exhibit a nontrivial linear combination that fulfills

$$ k_1x_1+k_2x_2+\dots+k_jx_j=0, $$

where the $k_i\in K$ are elements of the field $K$.

So far so good.

Without loss of generality, suppose that $x_1$ is the zero vector. Then

$$ k_1x_1+\dots+k_jx_j=0, $$ where all the $k_i$ must be zero.

No, no, stop the bus here. You just correctly stated your goal: you have to exhibit the $k_i$. That is: you have to come up with explicit values to assign to them. In this case set $k_1 = 1$, $k_2 = k_3 = \cdots = k_j = 0$ and you are done. Then the contradiction is that this is a nontrivial linear combination, contradicting the assumed linear independence.

This is actually more cleanly done by proving the contrapositive: "If one of the $x_i$ is $0$, then $\{x_1, \ldots, x_j\}$ is not a linearly independent set." Then you can give a direct proof of that statement rather than having to come up with a contradiction. The contrapositive is always equivalent to the original statement.