$\{ b_i + v \}$ is a basis for $V$ if $v = \sum \alpha_i b_i$ such that $\sum \alpha_i \neq -1$


I've tested this with many examples and the claim seems to hold. For the excluded case $\sum \alpha_i = -1$ I have found a counterexample, so the hypothesis cannot simply be dropped.

My proof is as follows:

Let $V$ be a vector space with basis $\mathcal{B} = \{ b_i \}$. Let $v = \sum \alpha_i b_i \in V$ be a vector such that $\sum \alpha_i \neq -1$; that is, the sum of the coefficients of $v$ with respect to $\mathcal{B}$ is not $-1$. Then the set $\mathcal{B}_v = \{ b_i + v \mid b_i \in \mathcal{B} \}$ is a basis for $V$.

First, we prove that $\mathcal{B}_v$ is linearly independent. Suppose \begin{equation} \sum a_i b_{v_i} = 0 \; , \quad b_{v_i} = b_i + v \in \mathcal{B}_v \; . \end{equation} Then \begin{equation} 0 = \sum a_i b_{v_i} = \sum a_i (b_i + v) = \sum a_i b_i + \left( \sum_j a_j \right) v \end{equation} \begin{equation} \sum a_i b_i + \left( \sum_j a_j \right) v = \sum a_i b_i + \left( \sum_j a_j \right) \left( \sum \alpha_i b_i \right) = \sum \left( a_i + \alpha_i \sum_j a_j \right) b_i \; . \end{equation} Linear independence of $\mathcal{B}$ implies $a_i + \alpha_i \sum_j a_j = 0$ for all $i$. Summing these equalities over $i$, we get \begin{equation} \sum_i a_i + \left( \sum_i \alpha_i \right) \left( \sum_j a_j \right) = \left( \sum \alpha_i + 1 \right) \left( \sum a_i \right) = 0 \; . \end{equation} Since $\sum \alpha_i \neq -1$, this forces $\sum a_i = 0$, and then each individual equality becomes $a_i + \alpha_i \cdot 0 = 0 \implies a_i = 0$ for all $i$. Hence, $\mathcal{B}_v$ is linearly independent.

Now, we prove that $\mathcal{B}_v$ spans $V$. We take the coefficients $\alpha_i$ of $v$ and form the same linear combination, but using the set $\mathcal{B}_v$: \begin{equation} \sum \alpha_i b_{v_i} = \sum \alpha_i b_i + \left( \sum \alpha_i \right) v = v + \left( \sum \alpha_i \right) v = \left( \sum \alpha_i + 1 \right) v \; , \end{equation} where we used $\sum \alpha_i b_i = v$. Since $\sum \alpha_i \neq -1$, we have \begin{equation} \left( \sum \alpha_i + 1 \right)^{-1} \left( \sum \alpha_i b_{v_i} \right) = v \; . \end{equation} Then, for any $b_{v_i} \in \mathcal{B}_v$, \begin{equation} b_{v_i} - \left( \sum \alpha_j + 1 \right)^{-1} \left( \sum \alpha_j b_{v_j} \right) = b_{v_i} - v = b_i \; , \end{equation} so every element of the original basis lies in the span of $\mathcal{B}_v$, which therefore spans all of $V$.
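As a quick numerical sanity check (not a proof), the sketch below tests the claim in $\mathbb{R}^3$ with the standard basis. Writing the vectors $b_i + v$ as the rows of a matrix gives $I + \mathbf{1}v^{\top}$, whose determinant is $1 + \sum \alpha_i$ by the matrix determinant lemma, so it should vanish exactly when $\sum \alpha_i = -1$. The helper names (`det3`, `shifted_basis`) are made up for this illustration.

```python
# Sanity check of the claim in R^3 with the standard basis (illustrative only).

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def shifted_basis(basis, v):
    """Return the shifted set {b_i + v}, one list per vector."""
    return [[bi + vi for bi, vi in zip(b, v)] for b in basis]

B = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # standard basis of R^3

# Coefficient sum 2 + (-1) + 0.5 = 1.5 != -1: shifted set should be a basis.
v_good = [2, -1, 0.5]
assert det3(shifted_basis(B, v_good)) != 0

# Coefficient sum 1 + (-1) + (-1) = -1: shifted set is degenerate.
v_bad = [1, -1, -1]
assert det3(shifted_basis(B, v_bad)) == 0
```

The determinants agree with the lemma: $1 + 1.5 = 2.5$ in the first case and $1 + (-1) = 0$ in the second.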


This concludes the proof. I'd also like to ask whether my solution to an exercise in which I applied this result is correct (I apologize for the lengthy question, but a separate one would be confusing without the explanation of my result above).

The exercise is the following:

b) Prove that if a subspace S of $\mathbb{R}^n$ contains a strongly positive vector [ $v = (a_1, ..., a_n)$ where $a_i > 0$ ], then $S$ has a basis of strongly positive vectors.

My solution: Let $v$ be the strongly positive vector, and let $\mathcal{C}$ be a basis for $S$. Since $v \in S$, we also have $\lambda v \in S$ for any scalar $\lambda$. Let $\epsilon_c$ be the smallest coordinate appearing among all the vectors of $\mathcal{C}$, and let $\epsilon_v$ be the smallest coordinate of $v$. First, we multiply $v$ by $\epsilon_v^{-1}$, so that the smallest coordinate of $v$ becomes $1$. Then we multiply $v$ again by $\vert \epsilon_c \vert + 1$, so that the smallest coordinate of $v$ suffices to make $\epsilon_c$ positive in an eventual sum of vectors, in the worst possible case. A final step is needed only if the sum of the coefficients of $v$ with respect to $\mathcal{C}$ is $-1$: in that case we multiply $v$ by some $\lambda > 1$, which does not make $v$ smaller and changes the sum of coefficients to a different value. Now we form the set $\mathcal{C}_v = \{ c_i + v \mid c_i \in \mathcal{C} \}$. By the result above, $\mathcal{C}_v$ is a basis for $S$, and it consists entirely of strongly positive vectors, since $v$ makes the smallest coordinate greater than $0$ in each of the sums $c_i + v$.
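If it helps, here is the construction carried out numerically on a concrete two-dimensional subspace $S \subset \mathbb{R}^3$. This is only an illustrative sketch; the vectors and helper names (`add`, `scale`) are invented for the example.

```python
# Walk through the construction on S = span{c1, c2} in R^3.

def add(u, w):
    """Coordinate-wise sum of two vectors."""
    return [a + b for a, b in zip(u, w)]

def scale(t, u):
    """Scalar multiple of a vector."""
    return [t * a for a in u]

C = [[1, -1, 0], [1, 2, 3]]      # basis of S; note c1 is not positive
v = add(C[0], C[1])              # v = c1 + c2 = (2, 1, 3), strongly positive
assert all(x > 0 for x in v)

eps_c = min(min(c) for c in C)   # smallest coordinate over all of C: -1
eps_v = min(v)                   # smallest coordinate of v: 1

v = scale(1 / eps_v, v)          # normalize so the smallest coordinate is 1
v = scale(abs(eps_c) + 1, v)     # now min(v) = |eps_c| + 1 > |eps_c|
# The coefficients of v w.r.t. C are now (2, 2), summing to 4 != -1,
# so the optional rescaling step is not needed here.

Cv = [add(c, v) for c in C]      # the shifted set {c_i + v}

# Every shifted vector is strongly positive ...
assert all(x > 0 for c in Cv for x in c)
# ... and the pair is still linearly independent (some 2x2 minor is nonzero).
minors = [Cv[0][i] * Cv[1][j] - Cv[0][j] * Cv[1][i]
          for i in range(3) for j in range(i + 1, 3)]
assert any(m != 0 for m in minors)
```

Since both shifted vectors lie in $S$ and $\dim S = 2$, independence alone makes $\mathcal{C}_v$ a basis of $S$ in this example.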


Are there any mistakes in the steps of my proof? What about the exercise? Even if the result is true, is my solution correct?

Any help is appreciated.