Is the uniqueness of the additive neutral element sufficient to prove x+z=x implies z=0?


The following was originally stated for $n$-tuples of elements from a scalar field, so most of the properties of "vectors" follow easily from the properties of the underlying scalar field. But the authors seem to want their development to be self-contained. For this reason I have replaced "$n$-tuple" with "vector".

The equality relation for vectors has been established, as have the associative and commutative laws of vector addition. The next property of vector addition to be introduced is the neutral element:

There exists a vector $\mathfrak{0}$ such that $\mathfrak{x}+\mathfrak{0}=\mathfrak{x}$ for every $\mathfrak{x}$. It follows there can be only one neutral element, for if $\mathfrak{0}$ and $\mathfrak{0}^{\prime}$ were two such elements we would have $\mathfrak{0}^{\prime}+\mathfrak{0}=\mathfrak{0}^{\prime}$ and $\mathfrak{0}+\mathfrak{0}^{\prime}=\mathfrak{0},$ so that by the commutative law of vector addition and the transitivity of vector equality we would have $\mathfrak{0}=\mathfrak{0}^{\prime}.$

Now suppose that for some $\mathfrak{x}$ we have $\mathfrak{x}+\mathfrak{z}=\mathfrak{x}.$ Do we have enough to prove that $\mathfrak{z}=\mathfrak{0}?$

I note in particular that the proof of the uniqueness of $\mathfrak{0}$ relies on the assumption that $\mathfrak{x}+\mathfrak{0}^{\prime}=\mathfrak{x}$ holds for all vectors, and thereby for $\mathfrak{x}=\mathfrak{0}$. That assumption comes from the definition of $\mathfrak{0}$ satisfying $\mathfrak{x}+\mathfrak{0}=\mathfrak{x}$ for every vector, and the assumption that $\mathfrak{0}^\prime$ is also 'such an element'.

Also note that the additive inverses have not yet been introduced.


Accepted answer:

No, this cannot be proved from just associativity, commutativity, and existence of a neutral element. For instance, consider the set $[0,1]$ with the binary operation $a*b=\min(a,b)$. This operation is associative and commutative and $1$ is a neutral element. But for any $x,y$ with $x\leq y$, we have $x*y=x$, and $y$ is not necessarily the neutral element $1$.
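A quick sanity check of this counterexample (my own sketch, not part of the answer): sample a few points of $[0,1]$ and verify that $\min$ is associative and commutative, that $1$ is neutral, and that $x*y=x$ can hold with $y\neq 1$.

```python
# a * b = min(a, b) on [0, 1]
star = min
points = [0.0, 0.25, 0.5, 0.75, 1.0]

# associativity and commutativity on the sample points
for a in points:
    for b in points:
        assert star(a, b) == star(b, a)
        for c in points:
            assert star(star(a, b), c) == star(a, star(b, c))

# 1 is a neutral element
for a in points:
    assert star(a, 1.0) == a

# the failure: 0.25 * 0.5 == 0.25, yet 0.5 is not the neutral element 1
assert star(0.25, 0.5) == 0.25 and 0.5 != 1.0
```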

Answer:

For an example with a more additive flavor, let's extend the operation $+$ to a new element $\infty$ with the rule that $x+\infty=\infty+x=\infty$ for all $x$. You can check that $+$ is still associative and commutative, and $0$ is still its identity element. However, we have $\infty+7=\infty$ and $7\neq0$.
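As an illustrative aside (my sketch, assuming nothing beyond the answer), Python's `float('inf')` already behaves like this absorbing element: floating-point addition is not associative in general, but the absorbing behaviour of `inf` is exact.

```python
inf = float("inf")
samples = [0.0, 1.0, 7.0, -3.5, inf]

for x in samples:
    assert x + inf == inf and inf + x == inf  # inf absorbs addition
    assert x + 0.0 == x                       # 0 is still the identity

# the failure: inf + 7 == inf, yet 7 != 0
assert inf + 7.0 == inf and 7.0 != 0.0
```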

Answer:

Since you are asking merely about the uniqueness of the identity of an abstract operation, and are not using any other structure on the space, we may take the operation "$+$" to be the multiplicative operation "$*$" of some ring. In a ring, the property $x*z=x$ implies that $x*(z-1)=0$. Thus, insufficiency follows from the existence of rings with zero divisors. So, for instance, if we treat $[a_1,b_1]+[a_2,b_2]$ as being equal to $[a_1a_2,b_1b_2]$ (componentwise multiplication, whose neutral element is $[1,1]$), then taking $x=[1,0]$ and $z=[1,2]$ gives $x+z=x$ even though $z\neq[1,1]$.
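A minimal check of this zero-divisor example (my own sketch; the pair $z=[1,2]$ is one arbitrary choice of a non-identity absorber on the second coordinate):

```python
# componentwise multiplication on pairs, written as the "+" of the answer
def plus(p, q):
    a1, b1 = p
    a2, b2 = q
    return (a1 * a2, b1 * b2)

identity = (1, 1)   # neutral element of componentwise multiplication
x = (1, 0)
z = (1, 2)

assert plus(x, identity) == x   # identity really is neutral
assert plus(x, z) == x          # x + z == x ...
assert z != identity            # ... yet z is not the neutral element
```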

If further properties of the vector space are introduced that make the $+$ operation incompatible with being isomorphic to the multiplicative operation of a ring with zero divisors (such as the existence of additive inverses), then those properties, in conjunction with the uniqueness of the identity, may be sufficient to establish the proposition in question.

Another viewpoint is to treat $x+y$ as some function indexed by $y$ applied to $x$. That is, "$x+y$" represents y.add(x). That there is some object $0$ such that $x+0=x$ for all $x$ simply means that 0.add is the identity function lambda x: x. We can easily have $x$ and $y$ such that y.add(x) is equal to $x$, yet $y\neq0$.
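This viewpoint can be sketched concretely (hypothetical class and names of my own, reusing $\min$ on $[0,1]$ as the operation):

```python
# model "x + y" as y.add(x): each element carries the function that
# adding it performs; here a + b = min(a, b) on [0, 1]
class Element:
    def __init__(self, value):
        self.value = value

    def add(self, x):
        return Element(min(x.value, self.value))

zero = Element(1.0)   # neutral element of min on [0, 1]
x = Element(0.25)
y = Element(0.5)

assert zero.add(x).value == x.value   # 0.add acts as the identity map
assert y.add(x).value == x.value      # y.add(x) == x ...
assert y.value != zero.value          # ... even though y != 0
```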