Prove a set S of vectors is linearly independent


I saw the following question on an exam and my teacher told me my proof is wrong because "these are not the same scalars".

Let $ v_1, v_2, \ldots, v_k, w $ be distinct vectors in a linear space over $ \mathbb{R} $.

Suppose $ w \notin Sp\{v_1 - w, v_2 - w, \ldots, v_k -w\} $.

Prove that if the set $ \{v_1 - w, v_2 - w, \ldots, v_k - w\} $ is linearly independent, then the set $ \{v_1, \ldots, v_k\} $ is also linearly independent.
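For intuition (my own example, not part of the exam question), here is why the hypothesis $ w \notin Sp\{v_1 - w, \ldots, v_k - w\} $ cannot be dropped. Take $ k = 1 $, $ v_1 = 0 $, and any $ w \neq 0 $. Then

$$ \{v_1 - w\} = \{-w\} $$

is linearly independent, while $ \{v_1\} = \{0\} $ is linearly dependent; consistently, $ w = (-1)(v_1 - w) \in Sp\{v_1 - w\} $, so the span hypothesis fails in exactly this case.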

Can you verify my proof?

Pf:

Suppose $ \{v_1, \ldots, v_k\} $ is linearly dependent. Then there exist scalars $ \lambda_1, \ldots, \lambda_k $, not all zero, such that

$$ \lambda_1 v_1 + \ldots + \lambda_k v_k = 0 $$

The set $ \{v_1 - w, \ldots, v_k - w\} $ is linearly independent and the $ \lambda_i $ are not all zero, therefore:

(*) $ \lambda_1(v_1 - w) + \ldots + \lambda_k(v_k - w) \neq 0 $.

(**) Set $ u = \lambda_1(v_1 - w) + \ldots + \lambda_k(v_k - w) $.

Therefore, $ \lambda_1v_1 - \lambda_1w + \ldots + \lambda_kv_k - \lambda_k w = u $.

$ \lambda_1v_1 + \ldots + \lambda_kv_k - (\lambda_1 + \ldots + \lambda_k)w = u \Rightarrow \lambda_1v_1 + \ldots + \lambda_kv_k - u = (\lambda_1 + \ldots + \lambda_k)w $.

We consider two cases:

If $ \lambda_1 + \ldots + \lambda_k \neq 0 $, then

$ w = \frac{\lambda_1v_1}{\lambda_1 + \ldots + \lambda_k} + \ldots + \frac{\lambda_kv_k}{\lambda_1 + \ldots + \lambda_k} - \frac{u}{\lambda_1 + \ldots + \lambda_k} $.

By (**), $ u \in Sp\{v_1 - w, \ldots, v_k - w\} $, and therefore the expression above is an element of $ Sp\{v_1 - w, \ldots, v_k - w\} $, so $ w \in Sp\{v_1 - w, \ldots, v_k - w\} $, a contradiction.

If $ \lambda_1 + \ldots + \lambda_k = 0 $, then $ \lambda_1v_1 + \ldots + \lambda_kv_k - u = 0 \Rightarrow u = \lambda_1v_1 + \ldots + \lambda_kv_k $. By our assumption, $ \lambda_1v_1 + \ldots + \lambda_kv_k = 0 $, so $ u = 0 $; but according to (*) and (**), $ u \neq 0 $, a contradiction.

Therefore, the set $ \{v_1, \ldots, v_k\} $ is linearly independent.
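As a sanity check on the statement itself (a concrete example of my own, not part of the proof): in $ \mathbb{R}^3 $ take $ v_1 = (1,0,0) $, $ v_2 = (0,1,0) $, $ w = (0,0,1) $. Then $ v_1 - w = (1,0,-1) $ and $ v_2 - w = (0,1,-1) $ are linearly independent, and

$$ a(1,0,-1) + b(0,1,-1) = (0,0,1) $$

forces $ a = 0 $ and $ b = 0 $ from the first two coordinates and then $ 0 = 1 $ in the third, which is impossible, so $ w \notin Sp\{v_1 - w, v_2 - w\} $. Both hypotheses hold, and indeed $ \{v_1, v_2\} $ is linearly independent, as the theorem predicts.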

Answer:

First, if $w=0$ the theorem is trivial, so assume $w\neq 0$.

Second, for linear dependence of the vectors $\{v_1, \ldots, v_k\}$ it suffices that in

$\beta_1 v_1+\ldots+\beta_k v_k=0$

at least one of the $\beta_i$ is different from $0$:

$(\exists j)(1\leq j\leq k)(\beta_j\neq0)$

Assume wlog $j=1$ (you can always rearrange a finite set).

Thus: $v_1=-\frac{1}{\beta_1}(\beta_2 v_2+\ldots+\beta_k v_k)$, and $v_1\neq 0\implies (\exists j)((2\leq j\leq k)\land \beta_j\neq 0)$.

Now use the hypothesis:

$\alpha_1(v_1-w)+...+\alpha_k(v_k-w)=0\implies (\forall i)(\alpha_i=0)$

But this in turn implies:

$\alpha_1(-\frac{1}{\beta_1}(\beta_2 v_2+\ldots+\beta_k v_k)-w)+\alpha_2(v_2-w)+\ldots+\alpha_k(v_k-w)=0$

This can be rewritten as:

$(-\frac{1}{\beta_1}\beta_2\alpha_1+\alpha_2)v_2+\ldots+(-\frac{1}{\beta_1}\beta_k\alpha_1+\alpha_k)v_k-(\alpha_1+\ldots+\alpha_k)w=0$

Again assume wlog $\beta_2\neq 0$. Then

$\alpha_2=\frac{\beta_2}{\beta_1}\alpha_1$,

a contradiction.