Prove linear independence of a set $\{\mathbf{x}-\mathbf{x_1},\ldots,\mathbf{x}-\mathbf{x_n}\}$


Let $V$ be a vector space and suppose that $\{\mathbf{x_1},\ldots,\mathbf{x_n}\}$ is a linearly independent subset of $V$. If $\mathbf{x} = \sum_{i=1}^n c_i\mathbf{x_i}$ where each $c_i \in \mathbb{R}$, prove that the set $\{\mathbf{x}-\mathbf{x_1},\ldots,\mathbf{x}-\mathbf{x_n}\}$ is linearly independent if and only if $\sum_{i=1}^n c_i \neq 1$.
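Before proving the claim, it can be sanity-checked numerically. The sketch below (my own illustration, assuming NumPy is available) takes the standard basis of $\mathbb{R}^3$ as the $\mathbf{x_i}$ and compares the rank of the matrix whose rows are the vectors $\mathbf{x}-\mathbf{x_i}$: the set is independent exactly when that matrix has full rank.

```python
import numpy as np

# Independent vectors x_1, x_2, x_3: the standard basis of R^3 (rows of X).
X = np.eye(3)

def diffs_rank(c):
    """Rank of {x - x_1, ..., x - x_n} where x = sum_i c_i x_i."""
    x = c @ X          # x as a linear combination of the x_i
    D = x - X          # row i of D is x - x_i (broadcasting x over the rows)
    return np.linalg.matrix_rank(D)

print(diffs_rank(np.array([0.2, 0.3, 0.1])))  # sum = 0.6 != 1 -> 3 (independent)
print(diffs_rank(np.array([0.5, 0.3, 0.2])))  # sum = 1      -> 2 (dependent)
```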


So I write a linear combination of the set as $a_1(\mathbf{x}-\mathbf{x_1})+\dots+a_n(\mathbf{x}-\mathbf{x_n})$, and I need to show that if this is zero then the $a_i$ must all be zero. I then substitute for $\mathbf{x}$ using its definition, but however I try to manipulate the expression I get nowhere.


BEST ANSWER

Suppose $\sum_{i = 1}^n c_i \neq 1$. Let $$a_1(\mathbf{x - x_1}) + \cdots + a_n(\mathbf{x - x_n}) = \mathbf{0}$$ be a linear dependence relation for $\{\mathbf{x - x_1},\ldots,\mathbf{x - x_n}\}$. Expanding, it is equivalent to the equation

$$A\mathbf{x} = \sum_{i = 1}^n a_i\mathbf{x_i},$$

where $A = \sum_{i = 1}^n a_i$. Since $\mathbf{x} = \sum_{i = 1}^n c_i\mathbf{x_i}$, the left hand side of the equation is the same as $\sum_{i = 1}^n Ac_i\mathbf{x_i}$. Thus $$\sum_{i = 1}^n Ac_i\mathbf{x_i} = \sum_{i = 1}^n a_i \mathbf{x_i},$$ which implies $Ac_i = a_i$ for all $i$ (by linear independence of $\{\mathbf{x_1},\ldots, \mathbf{x_n}\}$). Thus $\sum_{i = 1}^n Ac_i = \sum_{i = 1}^n a_i$, i.e., $A\sum_{i = 1}^n c_i = A$. Since $\sum_{i = 1}^n c_i \neq 1$, we must have $A = 0$. Therefore, $a_i = Ac_i = 0c_i = 0$ for all $i$. This shows that $\{\mathbf{x - x_1},\ldots, \mathbf{x - x_n}\}$ is linearly independent.

Now suppose $\sum_{i = 1}^n c_i = 1$. Then $$\sum_{i = 1}^n c_i(\mathbf{x - x_i}) = \left(\sum_{i = 1}^n c_i\right) \mathbf{x} - \sum_{i = 1}^n c_i\mathbf{x_i} = \mathbf{x} - \sum_{i = 1}^n c_i \mathbf{x_i} = \mathbf{0},$$

showing that the set $\{\mathbf{x - x_1},\ldots, \mathbf{x - x_n}\}$ is linearly dependent.
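The explicit dependence relation above can be checked numerically; the following is my own illustration (assuming NumPy), with three arbitrarily chosen independent vectors in $\mathbb{R}^3$ and coefficients summing to $1$.

```python
import numpy as np

# Three linearly independent vectors in R^3, one per row (det != 0).
X = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [3.0, 0.0, 1.0]])
c = np.array([0.5, 0.3, 0.2])   # coefficients with sum(c) == 1
x = c @ X                       # x = sum_i c_i x_i

# sum_i c_i (x - x_i) = (sum_i c_i) x - sum_i c_i x_i = x - x = 0
relation = c @ (x - X)
print(relation)                 # ~ [0. 0. 0.] up to floating-point error
```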

SECOND ANSWER

$\def\v#1{{\bf#1}}$ If $\sum c_i=1$ then $\{\v x-\v x_1,\ldots,\v x-\v x_n\}$ is dependent because $$\sum_{i=1}^n c_i(\v x-\v x_i) =\Bigl(\sum_{i=1}^nc_i\Bigr)\v x-\Bigl(\sum_{i=1}^n c_i\v x_i\Bigr) =\v x-\v x=\v 0\ ,$$ and the coefficients $c_i$ are obviously not all zero because their sum is $1$.

Suppose on the other hand that $\sum c_i\ne1$ and that $\sum\alpha_i(\v x-\v x_i)=\v 0$. If we substitute $\v x$ in terms of $\v x_i$ and find the coefficient of $\v x_1$ we get $$\alpha_1c_1+\cdots+\alpha_nc_1-\alpha_1\ ,$$ and since the vectors $\v x_i$ are independent this must be zero. Do the same for the other $\v x_i$ and add the resulting equations to get $$(\alpha_1+\cdots+\alpha_n)(c_1+\cdots+c_n-1)=0\ .$$ Since $c_1+\cdots+c_n\ne1$ we have $\alpha_1+\cdots+\alpha_n=0$. Then $$\sum\alpha_i(\v x-\v x_i)=\v 0$$ becomes $$\sum\alpha_i\v x_i=\v 0\ ,$$ and since the $\v x_i$ are independent, the scalars must be zero. This completes the proof.
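The key algebraic step, that summing the coefficient equations gives $(\alpha_1+\cdots+\alpha_n)(c_1+\cdots+c_n-1)=0$, can be verified symbolically for $n=3$; this sketch (my own, assuming SymPy) works directly with the coefficients $Ac_j - \alpha_j$ derived in the answer above.

```python
import sympy as sp

a1, a2, a3, c1, c2, c3 = sp.symbols('a1 a2 a3 c1 c2 c3')
a = [a1, a2, a3]
c = [c1, c2, c3]
A = sum(a)

# Coefficient of x_j in sum_i a_i (x - x_i) after substituting x = sum_j c_j x_j:
coeffs = [A * c[j] - a[j] for j in range(3)]

# Summing the three coefficient equations collapses to A * (sum(c) - 1).
lhs = sum(coeffs)
rhs = A * (sum(c) - 1)
print(sp.simplify(lhs - rhs))   # 0, confirming the identity
```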