Can a vector with unordered components exist?


It seems that for vector addition to be commutative, it needs to be defined in the "regular" manner, i.e. by adding matching vector components (because then commutativity of vector addition reduces fully to commutativity of addition over the underlying field). The key word is matching. Let's define a variant of $\mathbb{R}^2$ called $\mathbb{V}$, with addition of vectors $\alpha = (x_1, y_1), \beta = (x_2, y_2) \in \mathbb{V}$ defined as $\alpha+\beta = (x_1 + y_2, y_1+x_2)$. Clearly this can't be commutative, because swapping $\alpha$ and $\beta$ reverses the order of the components. If the ordered pair were replaced with a set, commutativity would be satisfied; however, other properties such as associativity would still be sacrificed by the very mismatching of components. This raises the question: can a vector space with unordered vector components even exist?


TLDR

NO, there is no such thing as a vector space where vector addition mismatches the components as you have described. There ARE linear transformations that can create this effect, but these are operations that take place on elements of a vector space, and the output of which is also an element of the vector space.

Redefining vector addition this way changes the set from a standard vector space into an entirely different object. We lose commutativity, of course, but we lose associativity as well. I doubt this can reasonably form a vector space at all.


A vector space $V$ is defined to be a set over a field satisfying certain axioms. Abandoning one or more of these axioms means that whatever set we are discussing is no longer a vector space.

Further, "ordered basis" is something of a misnomer. An ordered set implies an ordering relation among the elements of the set (e.g. $1<2$, $2<3$) so that the elements may be meaningfully compared. A vector space does not inherently come equipped with an ordering beyond that of the field it is defined over. One vector may be 'to the right of' another (e.g. $v,w\in V$ with $v\cdot \hat{x} < w\cdot\hat{x}$, where $\hat{x}$ is a vector indicating 'right' and is usually a unit vector), but no such relation is guaranteed for vector spaces of dimension greater than one. A perfect example is the complex numbers.

Instead, an ordered basis is entirely arbitrary, but fixed. That is to say, given a vector space with basis vectors $\hat{a},\hat{b},\hat{c},\dots$, an ordering may be imposed or asserted, such as $(\hat{a},\hat{b},\hat{c},\dots)$. There are equally valid orderings such as $(\hat{c},\hat{a},\hat{b},\dots)$, or any other permutation you may desire.

The main effect of this ordering is to assert that the matrix representation of a vector matches the ordering chosen. In other words, the following mapping between representations holds: $$ x_1\hat{a}+x_2\hat{b}+x_3\hat{c}+\dots \mapsto \begin{bmatrix} x_1\\x_2\\x_3\\ \vdots \end{bmatrix} $$

I think it is best to define matrices as generalized representations of linear combinations. If we do this, it is easy to see exactly what a chosen ordering does to a vector. Recall that the matrix multiplication representing the Euclidean inner product takes the form of a row vector and a column vector multiplied to yield a single scalar. In this sense the inner product can be described as a linear combination of the components of one vector with the components of the other vector. $$ \begin{bmatrix}x_1 & x_2 & x_3 & \dots \end{bmatrix} \begin{bmatrix}y_1 \\ y_2 \\ y_3 \\ \vdots \end{bmatrix} = x_1 y_1 + x_2 y_2 + x_3 y_3 + \dots $$
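To make this concrete, here is a small Python sketch of the inner product as a sum of matching componentwise products (the helper name `inner` is mine, purely for illustration):

```python
def inner(xs, ys):
    """Euclidean inner product: a linear combination pairing
    matching components of the two vectors."""
    assert len(xs) == len(ys)
    return sum(x * y for x, y in zip(xs, ys))

print(inner([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```

Note that `zip` pairs the components strictly by position, which is exactly the "matching" that the question is about.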

If we take matrices as representing the act of forming linear combinations, then we may write a vector in matrix form given a basis as $$ x_1\hat{a} + x_2\hat{b} + x_3\hat{c} + \dots = \begin{bmatrix} \hat{a} & \hat{b} & \hat{c} & \dots \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \end{bmatrix} $$ Hopefully it is clear that choosing a new order for the basis merely changes the sequence in which the components appear in the column matrix. Usually any equations discussed at the undergraduate level do not involve the basis beyond setting up differential equations for the coordinates, at least until a course in general relativity or differential geometry is taken, and so for most purposes the 'basis matrix' can be dropped altogether.
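A minimal sketch of this point, representing a vector as a map from symbolic basis labels to coefficients (the labels `"a"`, `"b"`, `"c"` stand in for $\hat{a},\hat{b},\hat{c}$ and are my own device): choosing an ordered basis just fixes the sequence in which the coefficients are read off into a column.

```python
# The vector 3a - b + 7c, stored without any ordering of the basis.
v = {"a": 3, "b": -1, "c": 7}

order1 = ["a", "b", "c"]  # one arbitrary but fixed ordering
order2 = ["c", "a", "b"]  # an equally valid permutation

print([v[e] for e in order1])  # [3, -1, 7]
print([v[e] for e in order2])  # [7, 3, -1]
```

The vector itself is unchanged; only its column representation is permuted.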

Finally, the operation you describe is not vector addition; it is a completely different operation. Let us give it the symbol $\oplus$. Taking the $\oplus$ operation between two vectors $x,y$ can be represented as $$ x \oplus y = (x_1+y_2)\hat{a} + (x_2 + y_1)\hat{b} = \begin{bmatrix} \hat{a} & \hat{b} \end{bmatrix} \begin{bmatrix} x_1+y_2\\ x_2 + y_1\\ \end{bmatrix} $$

This operation can be represented as a standard vector sum between $x$ and $y'= \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix}$. Thus we have $$ x \oplus y = \begin{bmatrix}x_1\\x_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix} $$ We can call the transformation matrix being applied $T$, so that $y'=T(y)$.
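This decomposition can be sketched in a few lines of Python (the function names `T` and `oplus` are mine, not from the question):

```python
def T(v):
    """The swap matrix [[0, 1], [1, 0]] applied to a 2-vector."""
    return (v[1], v[0])

def oplus(x, y):
    """x (+) y = x + T(y) = (x1 + y2, x2 + y1)."""
    ty = T(y)
    return (x[0] + ty[0], x[1] + ty[1])

x, y = (1, 2), (30, 40)
print(oplus(x, y))  # (1 + 40, 2 + 30) = (41, 32)
```

Writing $\oplus$ as ordinary addition composed with $T$ is what makes the failures of commutativity and associativity below easy to see.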

It is worth noting that the operation $y \oplus x$ is just $T(x\oplus y)$, since $T(T(x))=x$. Thus

$$ y \oplus x = \begin{bmatrix}y_1\\y_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0&1\\1&0\end{bmatrix} \left( \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix}+ \begin{bmatrix}x_1\\x_2\end{bmatrix} \right) = \begin{bmatrix}0&1\\1&0\end{bmatrix} \left( \begin{bmatrix}x_1\\x_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix} \right) = T(x\oplus y) $$ This shows that commutativity is lost (whenever $x\oplus y$ is not fixed by $T$), and so the operation you have defined does not make $\mathbb{V}$ a vector space.

Further, associativity is lost. $$ x\oplus(y\oplus z) = \begin{bmatrix} x_1+y_2+z_1\\x_2+y_1+z_2 \end{bmatrix} \ne (x\oplus y)\oplus z = \begin{bmatrix} x_1+y_2+z_2\\x_2+y_1+z_1 \end{bmatrix} $$
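Both failures can be checked numerically with the same sketch as above (again, `T` and `oplus` are my own illustrative names):

```python
def T(v):
    """Swap the two components of a 2-vector."""
    return (v[1], v[0])

def oplus(x, y):
    """x (+) y = (x1 + y2, x2 + y1)."""
    return (x[0] + y[1], x[1] + y[0])

x, y, z = (1, 2), (30, 40), (500, 600)

print(oplus(x, y), oplus(y, x))       # (41, 32) (32, 41): not commutative
print(oplus(y, x) == T(oplus(x, y)))  # True: y (+) x = T(x (+) y)
print(oplus(x, oplus(y, z)))          # (541, 632)
print(oplus(oplus(x, y), z))          # (641, 532): not associative
```

The last two lines reproduce the two sides of the displayed inequality above, with $z_1$ and $z_2$ landing in opposite components.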

I'd look into commutators if this is something you are interested in.