Constructing a new basis


Let $B=\{v_1,\dots,v_n\}$ be a basis for a vector space $V$. Show that $A=\{v+v_1,\dots,v+v_n\}$, where $v=\sum_{i=1}^n a_iv_i$, is a basis for $V$ if and only if $\sum_{i=1}^n a_i\neq -1$.

Of course it is enough to show that the set $A$ is linearly independent. I tried assuming that there is a nontrivial linear combination of elements of $A$ equal to $0$ and that $\sum_{i=1}^n a_i=-1$, hoping to reach a contradiction. For the case $n=2$ this works well, but in general I am getting quite confused by all the equations and indices. For the other direction I have no good idea.

3 Answers

Best answer:

If you already know that the coordinates of a vector with respect to a fixed basis are unique, then you can give a pretty short proof:

If $\sum_i a_i = -1$, then $A$ is linearly dependent, as $ \sum_i a_i (v+ v_i) = -v + v = 0$.

Now assume $A$ is linearly dependent. There exists a non-trivial linear combination $$ 0 = \sum_j \lambda_j (v+v_j) $$ of zero. Notice that $L := \sum_j \lambda_j \ne 0$: if $L=0$, the combination reduces to $0=\sum_j\lambda_jv_j$, and since $B$ is a basis this forces every $\lambda_j=0$, contradicting non-triviality. From $0=Lv+\sum_j\lambda_jv_j$ it follows that $$ \sum_i a_i v_i = v = -\frac1L \sum_j \lambda_j v_j, $$ and by the uniqueness of the coefficients we have $a_i = -\lambda_i / L$ for each $i$, hence $\sum_i a_i = -\frac1L\sum_i\lambda_i = -1$.
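The claimed equivalence can be sanity-checked numerically. The sketch below (my own setup, not part of the answer) takes $V=\mathbb R^4$ with $B$ the standard basis, so the coordinate matrix of $A$ is easy to form; $A$ is a basis exactly when that matrix is invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
E = np.eye(n)

def new_basis_matrix(a):
    """Columns are the coordinates of v + v_i, with B = standard basis of R^n."""
    v = E @ a                      # v = sum_i a_i v_i
    return np.column_stack([v + E[:, i] for i in range(n)])

# Generic coefficients: sum(a) != -1, so A should be a basis (matrix invertible).
a = rng.normal(size=n)
assert not np.isclose(a.sum(), -1.0)
assert not np.isclose(np.linalg.det(new_basis_matrix(a)), 0.0)

# Shift a so its entries sum to exactly -1: A should become linearly dependent.
a_bad = a - (a.sum() + 1.0) / n
assert np.isclose(a_bad.sum(), -1.0)
assert np.isclose(np.linalg.det(new_basis_matrix(a_bad)), 0.0)
```

This is only a finite-dimensional numerical check of course, not a proof, but it makes the $\sum a_i=-1$ threshold easy to see experimentally.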

Another answer:

First, I believe the statement should read "$\sum_{i=1}^na_i\neq -1$." For instance, if $v_1=(1,0)$ and $v_2=(0,1)$ with $a_1=1$ and $a_2=0$, then $v=(1,0)$ and $\{v+v_1=(2,0),\,v+v_2=(1,1)\}$ is a basis even though the coefficients sum to $1$. Now suppose that $\sum_{i=1}^na_i\neq-1$, and let $\lambda_1,\dots,\lambda_n$ be scalars with $\sum_{i=1}^n\lambda_i(v+v_i)=0$; write $\Lambda=\sum_{i=1}^n\lambda_i$. Then:

$$0=\sum_{i=1}^n\lambda_i(v+v_i)=\sum_{i=1}^n\lambda_iv+\sum_{i=1}^n\lambda_iv_i=\Lambda\sum_{j=1}^na_jv_j+\sum_{i=1}^n\lambda_iv_i=\sum_{i=1}^n(\Lambda a_i+\lambda_i)v_i$$

Since $B$ is a basis, $\Lambda a_i+\lambda_i=0$ for every $i$. Summing over $i$ gives $\sum_{i=1}^n(\Lambda a_i+\lambda_i)=0$, i.e. $\Lambda\big(\sum_{i=1}^na_i\big)+\Lambda=0$. Since $\sum_{i=1}^na_i\neq-1$, this forces $\Lambda=0$, and then $\Lambda a_i+\lambda_i=0$ gives $\lambda_i=0$ for all $i$, as desired.

Conversely, suppose that $A$ is a basis. If every $a_i=0$ then $\sum_{i=1}^na_i=0\neq-1$ and we are done, so assume some $a_i\neq0$, i.e. $v\neq0$. Performing the expansion above with $\lambda_i=a_i$ (so $\Lambda=\sum_{j=1}^na_j$), we get $$\sum_{i=1}^na_i(v+v_i)=\sum_{i=1}^n(\Lambda a_i+a_i)v_i=(\Lambda+1)\sum_{i=1}^na_iv_i=(\Lambda+1)v.$$ Since $A$ is linearly independent and the coefficients $a_i$ are not all zero, the left-hand side is non-zero, so $(\Lambda+1)v\neq0$, hence $\Lambda+1\neq0$, i.e. $\sum_{i=1}^na_i\neq-1$.
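The key identity in this computation, $\sum_i\lambda_i(v+v_i)=\sum_i(\Lambda a_i+\lambda_i)v_i$, can be checked numerically. The sketch below (my own concrete choice: $V=\mathbb R^5$ with $v_i=e_i$) compares the two sides for random coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
E = np.eye(n)                      # take v_i = e_i, an arbitrary concrete choice
a = rng.normal(size=n)
lam = rng.normal(size=n)
v = E @ a                          # v = sum_j a_j v_j
Lam = lam.sum()                    # Lambda = sum_i lambda_i

# Left side: the combination of the shifted vectors v + v_i.
lhs = sum(lam[i] * (v + E[:, i]) for i in range(n))
# Right side: the same combination re-expressed in the basis B.
rhs = sum((Lam * a[i] + lam[i]) * E[:, i] for i in range(n))
assert np.allclose(lhs, rhs)
```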

Another answer:

The matrix having as columns the coordinates of the vectors in $A$ with respect to the basis $B$ is $$ M=\begin{bmatrix} a_1+1 & a_1 & \dots & a_1 \\ a_2 & a_2+1 & \dots & a_2 \\ a_3 & a_3 & \dots & a_3 \\ \vdots & \vdots & \ddots & \vdots \\ a_n & a_n & \dots & a_n+1 \end{bmatrix} $$ and $A$ is a basis if and only if $M$ is invertible.

If $$ C=\begin{bmatrix} a_1 & a_1 & \dots & a_1 \\ a_2 & a_2 & \dots & a_2 \\ a_3 & a_3 & \dots & a_3 \\ \vdots & \vdots & \ddots & \vdots \\ a_n & a_n & \dots & a_n \end{bmatrix} $$ then $M=C+I$. Note that $C-\lambda I$ is invertible if and only if $\lambda$ is not an eigenvalue of $C$.

Assuming that not all the $a_i$ are zero, $C$ has rank $1$; therefore it has $0$ as an eigenvalue of multiplicity at least $n-1$ and $a_1+a_2+\dots+a_n$ as an eigenvalue of multiplicity $1$, because $$ \begin{bmatrix} 1 & 1 & \dots & 1 \end{bmatrix} C= (a_1+a_2+\dots+a_n)\begin{bmatrix} 1 & 1 & \dots & 1 \end{bmatrix} $$ and the eigenvalues of $C$ are the same as the eigenvalues of $C^T$.

Thus $M$ is invertible if and only if $-1$ is not an eigenvalue of $C$, that is, if and only if $a_1+a_2+\dots+a_n\ne-1$.
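The eigenvalue argument also pins down the determinant: since $C=a\mathbf 1^T$ has rank one, the matrix determinant lemma (a standard fact, not stated in the answer) gives $\det M=\det(I+a\mathbf 1^T)=1+\sum_i a_i$, which vanishes exactly when $\sum_i a_i=-1$. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
a = rng.normal(size=n)
C = np.outer(a, np.ones(n))        # rank-one matrix with every column equal to a
M = C + np.eye(n)                  # coordinate matrix of A with respect to B

# Matrix determinant lemma for the rank-one update I + a 1^T:
assert np.isclose(np.linalg.det(M), 1 + a.sum())
```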


As a counterexample to your statement with $1$ instead of $-1$, consider $a_1=1$, $a_2=a_3=\dots=a_n=0$. The sum is $1$, but clearly $\{2v_1,v_2,\dots,v_n\}$ is a basis for $V$ (provided the characteristic of the base field is $\ne2$; in that case $-1=1$ anyway).