Finding an error in proof for an axiom of a vector space


Recently, I asked a question about axioms of vector spaces and counterexamples. The question can be found at: A Counter Examples in Linear Algebra (Vector Space)

However, a few days before asking that question, I tried to prove that the axiom $1 \cdot v = v$ in the definition of a vector space is redundant. In the proof, I used the fact that every vector space has a basis (which is proved using Zorn's Lemma). However, I know the axiom is not redundant: there is a system satisfying every other axiom in which $1 \cdot v \neq v$, so the system fails to be a vector space. I would like to know where exactly in the proof I have made a mistake. I have been stuck on this for a few weeks and, no matter what I do, I cannot find the exact point where things start to go wrong. All help, suggestions and comments are appreciated. The proof I tried is as follows:

We know that $V$ must have a basis (which can be proved using Zorn's Lemma), say $B$. Let $v \in V$. Then, $\exists v_1, v_2, \cdots, v_n \in B$ and $\alpha_1, \alpha_2, \cdots, \alpha_n \in \mathbb{F}$ such that $v = \alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \sum\limits_{i = 1}^{n} \alpha_i \cdot v_i$. Now, let $1 \cdot v = w$, where $1 \in \mathbb{F}$ is the unity of the field and $w \in V$. Again, since $B$ is a basis, $\exists w_1, w_2, \cdots, w_m \in B$ and $\beta_1, \beta_2, \cdots, \beta_m \in \mathbb{F}$ such that $w = \beta_1 \cdot w_1 + \beta_2 \cdot w_2 + \cdots + \beta_m \cdot w_m = \sum\limits_{i = 1}^{m} \beta_i \cdot w_i$. Therefore, we have \begin{align*} 1 \cdot \sum\limits_{i = 1}^{n} \alpha_i \cdot v_i &= \sum\limits_{i = 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{n} 1 \cdot \left( \alpha_i \cdot v_i \right) &= \sum\limits_{i = 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{n} \left( 1 \alpha_i \right) \cdot v_i &= \sum\limits_{i = 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{n} \alpha_i \cdot v_i &= \sum\limits_{i = 1}^{m} \beta_i \cdot w_i \\ \end{align*}

Consider the two sets $S_1 = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace$ and $S_2 = \left\lbrace w_1, w_2, \cdots, w_m \right\rbrace$. Clearly, $S_1 \subseteq B$ and $S_2 \subseteq B$, and hence $S_1, S_2$ are linearly independent. Also, the set $S_1 \cup S_2 = \left\lbrace v_1, v_2, \cdots, v_n, w_1, w_2, \cdots, w_m \right\rbrace \subseteq B$ is also linearly independent. Suppose first, if possible, that $S_1 \cap S_2 = \emptyset$, where $\emptyset$ denotes the empty set. This means that $\forall i \in \left\lbrace 1, 2, \cdots n \right\rbrace$ and $\forall j \in \left\lbrace 1, 2, \cdots, m \right\rbrace, v_i \neq w_j$.

Using the fact that the additive inverse of $a \cdot v$ is $\left( -a \right) \cdot v$, we add the additive inverse of each vector $\beta_i \cdot w_i$ to the last equation to obtain $$\sum\limits_{i = 1}^{n} \alpha_i \cdot v_i + \sum\limits_{i = 1}^{m} \left( - \beta_i \right) \cdot w_i = \textbf{0}$$ Since $S_1 \cup S_2$ is linearly independent, $\forall i \in \left\lbrace 1, 2, \cdots, n \right\rbrace, \alpha_i = 0$ and $\forall i \in \left\lbrace 1, 2, \cdots, m \right\rbrace, - \beta_i = 0$, which in turn gives that $\beta_i = 0$. Therefore, $v = 0 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_n = \textbf{0}$ and $w = 0 \cdot w_1 + 0 \cdot w_2 + \cdots + 0 \cdot w_m = \textbf{0}$. Hence, in this case $1 \cdot v = 1 \cdot \textbf{0} = \textbf{0} = v$.

Now, let us consider the case $S_1 \cap S_2 \neq \emptyset$. Let there be $r$ vectors, where $1 \leq r \leq \min \left\lbrace m, n \right\rbrace$, which are common to $S_1$ and $S_2$. We shall name them $v_i$, where $i \in \left\lbrace 1, 2, \cdots, r \right\rbrace$. Thus, our sets look like $S_1 = \left\lbrace v_1, v_2, \cdots, v_r, v_{r + 1}, \cdots, v_n \right\rbrace$ and $S_2 = \left\lbrace v_1, v_2, \cdots, v_r, w_{r + 1}, \cdots, w_m \right\rbrace$. Now, $v = \sum\limits_{i = 1}^{r} \alpha_i \cdot v_i + \sum\limits_{i = r + 1}^{n} \alpha_i \cdot v_i$ and $w = \sum\limits_{i = 1}^{r} \beta_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \beta_i \cdot w_i$. Again, \begin{align*} 1 \cdot \left( \sum\limits_{i = 1}^{r} \alpha_i \cdot v_i + \sum\limits_{i = r + 1}^{n} \alpha_i \cdot v_i \right) &= \sum\limits_{i = 1}^{r} \beta_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{r} 1 \cdot \left( \alpha_i \cdot v_i \right) + \sum\limits_{i = r + 1}^{n} 1 \cdot \left( \alpha_i \cdot v_i \right) &= \sum\limits_{i = 1}^{r} \beta_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{r} \left( 1 \alpha_i \right) \cdot v_i + \sum\limits_{i = r + 1}^{n} \left( 1 \alpha_i \right) \cdot v_i &= \sum\limits_{i = 1}^{r} \beta_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \beta_i \cdot w_i \\ \therefore \sum\limits_{i = 1}^{r} \alpha_i \cdot v_i + \sum\limits_{i = r + 1}^{n} \alpha_i \cdot v_i &= \sum\limits_{i = 1}^{r} \beta_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \beta_i \cdot w_i \\ \end{align*} Adding the additive inverses of each of the vectors on the right hand side of the equation to both sides, and using the axioms of a vector space multiple times, we get $$\sum\limits_{i = 1}^{r} \left( \alpha_i - \beta_i \right) \cdot v_i + \sum\limits_{i = r + 1}^{n} \alpha_i \cdot v_i + \sum\limits_{i = r + 1}^{m} \left( - \beta_i \right) \cdot w_i = \textbf{0}$$ Since $S_1 \cup S_2$ is linearly independent, 
$\forall i \in \left\lbrace 1, 2, \cdots, r \right\rbrace, \alpha_i - \beta_i = 0 \Rightarrow \alpha_i = \beta_i$. Also, $\forall i \in \left\lbrace r + 1, r + 2, \cdots, n \right\rbrace, \alpha_i = 0$ and $\forall i \in \left\lbrace r + 1, r + 2, \cdots, m \right\rbrace, \beta_i = 0$. This tells us that $v = \sum\limits_{i = 1}^{r} \alpha_i \cdot v_i$ and $w = 1 \cdot v = \sum\limits_{i = 1}^{r} \alpha_i \cdot v_i$. Hence, $1 \cdot v = v$.

Note that we can prove $- \left( a \cdot v \right) = \left( -a \right) \cdot v$ as follows:

\begin{align} \left( -a \right) \cdot v + a \cdot v = \left( -a + a \right) \cdot v = 0 \cdot v = \textbf{0} \end{align}
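(The last step, $0 \cdot v = \textbf{0}$, is itself a consequence of the remaining axioms and does not need $1 \cdot v = v$: by distributivity,

\begin{align} 0 \cdot v = \left( 0 + 0 \right) \cdot v = 0 \cdot v + 0 \cdot v \end{align}

and adding the additive inverse of $0 \cdot v$ to both sides gives $0 \cdot v = \textbf{0}$.)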

Answer:

Let $\Bbb F=\Bbb R$, $V=\Bbb Z$ and for $\alpha\in\Bbb F$, $v\in\Bbb Z$, let $\alpha v=0$. Then $V$ is an abelian group and $\cdot$ is an action of the ring-without-$1$ (sometimes called a rng) $\Bbb F$ on $V$. Hence this is almost a vector space - only the fact that $1$ acts as identity is missing.

This $V$ does not have a basis. Indeed, any linear combination $\alpha_1v_1+\alpha_2v_2+\ldots +\alpha_nv_n$ will always be $=0$. So the very first sentence in your argument is invalid.
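The counterexample can be checked mechanically. Below is a minimal sketch in Python: the name `smul` is mine, and a finite sample of scalars and vectors stands in for all of $\Bbb R$ and $\Bbb Z$. It verifies that every scalar-multiplication axiom except the unit axiom holds, and that every linear combination collapses to $0$ (so no subset can span $V$, which is why the basis step fails).

```python
from itertools import product

def smul(alpha, v):
    """The 'zero' scalar action from the answer: alpha . v := 0 for all inputs."""
    return 0

scalars = [-2.0, -1.0, 0.0, 0.5, 1.0, 3.0]   # sample of F = R
vectors = [-3, -1, 0, 2, 5]                  # sample of V = Z

# The axioms involving scalar multiplication all hold for this action.
for a, b in product(scalars, repeat=2):
    for v, w in product(vectors, repeat=2):
        assert smul(a, v + w) == smul(a, v) + smul(a, w)   # a.(v+w) = a.v + a.w
        assert smul(a + b, v) == smul(a, v) + smul(b, v)   # (a+b).v = a.v + b.v
        assert smul(a * b, v) == smul(a, smul(b, v))       # (ab).v = a.(b.v)

# Only the unit axiom fails: 1 . v != v whenever v != 0.
assert smul(1, 5) == 0 and smul(1, 5) != 5

# Every linear combination is 0, so no subset of V spans V: no basis exists,
# which is exactly where the Zorn's-lemma step of the attempted proof breaks.
assert sum(smul(a, v) for a, v in zip([2.0, -1.0], [3, 4])) == 0
```

Running the script raises no assertion errors, confirming that all the structure except $1 \cdot v = v$ survives.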