Dimension of space spanned by vectors subtracted by a linear combination of them


I've encountered a problem similar to this one: Linearly independent vectors each subtracted by a linear combination of them are linearly dependent if coefficients add up to 1

To restate it for convenience here, we're given $n$ linearly independent vectors $v_1, \ldots, v_n$ of an $\mathbb{F}$-vector space $V$ and a linear combination $u$ of them: $u = \lambda_1 v_1 + \ldots + \lambda_n v_n$. The goal was to show that

The vectors $v_1 - u, \ldots, v_n - u$ are linearly dependent if and only if $\lambda_1 + \ldots + \lambda_n = 1$.

(The answers there prove this equivalence.)

I'd like to ask a follow-up question:

What is the dimension of the space spanned by the vectors $v_1-u, \ldots, v_n-u$, $\dim (\text{span} (\{v_1-u, \ldots, v_n-u\}))$?

We know that when $\lambda_1 + \ldots + \lambda_n = 1$ the vectors are linearly dependent, so in that case $\dim (\text{span} (\{v_1-u, \ldots, v_n-u\}))\leq n -1$ (when the coefficients do not sum to $1$, the $n$ vectors are independent and the dimension is simply $n$). But can we find a lower bound (that possibly depends on the $\lambda_i$'s)?

I've tried looking at what happens in some simple cases, namely when $u = v_i$ for some $i$ and when $u$ is the mean of $v_1, \ldots, v_n$, $u = \frac{1}{n} \sum_{i=1}^{n} v_i$. However, I am utterly stuck.
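The two simple cases can be checked numerically. Below is a small sketch (my own, not from the linked post) that takes $v_i = e_i$, the standard basis of $\mathbb{Q}^n$, and computes the rank of the shifted family $\{v_i - u\}$ with exact rational arithmetic; the helper names `rank` and `shifted` are just illustrative.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of rational row vectors, by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

n = 4
# Take v_i = e_i, the standard basis of Q^n (any independent family works).
def shifted(lams):
    """Rows e_i - u, where u = sum_k lams[k] * e_k."""
    u = [Fraction(l) for l in lams]
    return [[Fraction(int(i == j)) - u[j] for j in range(n)] for i in range(n)]

print(rank(shifted([1, 0, 0, 0])))               # u = v_1 (sum of lambdas = 1)
print(rank(shifted([Fraction(1, n)] * n)))       # u = mean (sum of lambdas = 1)
print(rank(shifted([Fraction(1, 2), 0, 0, 0])))  # sum of lambdas = 1/2
```

In both of the cases with $\sum_i \lambda_i = 1$ the rank comes out as $n-1 = 3$, and it is $n = 4$ in the third case, which suggests the dimension never drops below $n-1$.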

I'd greatly appreciate any hints on approaching this problem.


Well, let $X:=\{\sum_{k=1}^n a_k(v_k-u): a_k\in\mathbb{F}\}$ be the linear span of the new vectors. If $x\in X$, then $x=\sum_{k=1}^n a_k(v_k-u)=\sum_{k=1}^n a_k v_k -\big(\sum_{k=1}^n a_k\big) u$.

So if we start with an arbitrary $y\in \text{span}(\{v_1,\ldots,v_n\})$, we can find coefficients $a_k$ such that $y=\sum_{k=1}^n a_k v_k$. Let $a:=\sum_{k=1}^n a_k$. Then $y-au=\sum_{k=1}^n a_k v_k - au \in X$, so $y\in X+\langle u\rangle$. This means that $\text{span}(\{v_1,\ldots,v_n\})\subset X+\langle u\rangle$, where $\langle u\rangle $ is just the linear span of the single vector $u$. Since the left-hand side has dimension $n$ and $\dim(X+\langle u\rangle)\leq \dim X + 1$, the dimension of $X$ must be at least $n-1$.
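Combining this lower bound with the equivalence from the linked question, the dimension is exactly $n-1$ when $\sum_k \lambda_k = 1$ and exactly $n$ otherwise. A quick sanity check of that combined claim (my own sketch, again with $v_i = e_i$ in $\mathbb{Q}^n$ and a hypothetical helper `dim_span`):

```python
from fractions import Fraction as F

def rank(rows):
    """Rank of a list of rational row vectors, by Gaussian elimination."""
    m = [list(row) for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

n = 5
def dim_span(lams):
    """dim span{v_i - u} for v_i = e_i in Q^n and u = sum_k lams[k] e_k."""
    u = [F(l) for l in lams]
    return rank([[F(int(i == j)) - u[j] for j in range(n)] for i in range(n)])

for lams in ([F(1, 5)] * 5, [2, -1, 0, 0, 0], [F(1, 2)] * 5, [3, 1, 0, 0, 0]):
    s = sum(F(l) for l in lams)
    expected = n - 1 if s == 1 else n
    assert dim_span(lams) == expected
    print(s, dim_span(lams))
```

Each test case agrees with the prediction: the first two coefficient vectors sum to $1$ and give dimension $4 = n-1$; the last two do not and give dimension $5 = n$.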