Can a vector with unordered components exist?

It seems that, for vector addition to be commutative, it needs to be defined in the "regular" manner, i.e. by adding matching vector components (because then the commutativity of vector addition reduces entirely to the commutativity of addition over the associated field). The key word is matching. Let's define a variant of $\mathbb{R}^2$ called $\mathbb{V}$, with addition of vectors $\alpha = (x_1, y_1)$ and $\beta = (x_2, y_2)$ in $\mathbb{V}$ defined as $\alpha+\beta = (x_1 + y_2,\, y_1+x_2)$. Clearly this cannot be commutative, because swapping $\alpha$ and $\beta$ reverses the order of the components. If the ordered pair were replaced with a set, commutativity would be satisfied; however, other properties such as associativity would still be sacrificed by the very mismatching of components. That raises the question: can a vector space with unordered vector components even exist?
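For concreteness, here is a quick Python sanity check of the non-commutativity claim (the helper name `v_add` is just my label for the mismatched addition defined above):

```python
# Mismatched addition on R^2: (x1, y1) + (x2, y2) := (x1 + y2, y1 + x2)
def v_add(a, b):
    return (a[0] + b[1], a[1] + b[0])

alpha, beta = (1, 2), (5, 3)
print(v_add(alpha, beta))  # (4, 7)
print(v_add(beta, alpha))  # (7, 4)  -> the components come out swapped
```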
TL;DR
NO, there is no such thing as a vector space whose vector addition fails to respect the basis in the way you have described. There ARE linear transformations that can create this effect, but those are operations that act on elements of a vector space, and their output is also an element of the vector space.
Redefining vector addition this way changes the set from a standard vector space into an entirely different object. We lose not only commutativity but, as shown below, associativity as well. I am not sure this can reasonably form a vector space at all.
A vector space $V$ is defined to be a set over a field such that certain properties are met. Abandoning one or more of these properties means that whatever set we are discussing is no longer a vector space.
Further, an "ordered basis" is something of a misnomer. An ordered set implies an ordering relation among the elements of the set (e.g. $1<2$, $2<3$) so that the elements may be meaningfully compared. A vector space does not inherently come equipped with an ordering that extends beyond the field over which it is defined. One vector may be 'to the right of' another (e.g. for $v,w\in V$, $v\cdot \hat{x} < w\cdot\hat{x}$, where $\hat{x}$ is a vector indicating 'right', usually a unit vector), but no such relation is guaranteed for vector spaces of more than one dimension; the complex numbers are a perfect example.

Instead, an "ordered basis" is entirely arbitrary, but fixed. That is to say, given a vector space with basis $\hat{a},\hat{b},\hat{c},\dots$, an ordering may be imposed or asserted, such as $\{\hat{a},\hat{b},\hat{c},\dots\}$. Orderings such as $\{\hat{c},\hat{a},\hat{b},\dots\}$, or any other permutation you may desire, are equally valid.

The main effect of this ordering is to assert that the matrix representation of a vector matches the ordering chosen. In other words, the mapping between representations
$$ x_1\hat{a}+x_2\hat{b}+x_3\hat{c}+\dots \mapsto \begin{bmatrix} x_1\\x_2\\x_3\\ \vdots \end{bmatrix} $$
is valid.

I think it is best to regard matrices as generalized representations of linear combinations. If we do this, it is easy to see exactly what a chosen ordering does to a vector. Recall that the matrix multiplication representing the Euclidean inner product takes the form of a row vector and a column vector multiplied to yield a single scalar. In this sense the inner product can be described as a linear combination of the components of one vector with the components of the other:
$$ \begin{bmatrix}x_1 & x_2 & x_3 & \dots \end{bmatrix} \begin{bmatrix}y_1 \\ y_2 \\ y_3 \\ \vdots \end{bmatrix} = x_1 y_1 + x_2 y_2 + x_3 y_3 + \dots $$
If we take matrices as representing the act of forming linear combinations, then given a basis we may write a vector in matrix form as
$$ x_1\hat{a} + x_2\hat{b} + x_3\hat{c} + \dots = \begin{bmatrix} \hat{a} & \hat{b} & \hat{c} & \dots \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \end{bmatrix} $$
Hopefully it is clear that choosing a new order for the basis merely changes the sequence in which the components appear in the column matrix (a small numerical check follows below). Most equations discussed in an undergraduate setting do not involve the basis beyond setting up differential equations for the coordinates, at least until a course in general relativity or differential geometry, so for most purposes the 'basis matrix' can be dropped altogether.
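As a sanity check of that last point, here is a small numpy sketch; the particular basis `B` and permutation `perm` are arbitrary choices of mine for illustration. Reordering the basis columns together with the component rows leaves the represented vector unchanged:

```python
import numpy as np

# Columns of B are the basis vectors a-hat, b-hat, c-hat
# (any invertible B works; this one is just an example).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
x = np.array([2.0, -1.0, 3.0])   # components in the order (a, b, c)

perm = [2, 0, 1]                 # a new ordering: (c, a, b)
B_perm = B[:, perm]              # reorder the basis columns...
x_perm = x[perm]                 # ...and the components the same way

# The vector itself does not change, only its column of components does.
print(np.allclose(B @ x, B_perm @ x_perm))  # True
```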
Finally, the operation you describe is not vector addition; it is a completely different operation. Let us give it the symbol $\oplus$. Applying $\oplus$ to two vectors $x,y$ can be represented as
$$ x \oplus y = (x_1+y_2)\hat{a} + (x_2 + y_1)\hat{b} = \begin{bmatrix} \hat{a} & \hat{b} \end{bmatrix} \begin{bmatrix} x_1+y_2\\ x_2 + y_1 \end{bmatrix} $$
This operation can be rewritten as a standard vector sum between $x$ and $y'= \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix}$. Thus we have
$$ x \oplus y = \begin{bmatrix}x_1\\x_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix} $$
We can define the transformation matrix being applied as $T$, so that $y'=T(y)$.
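In numpy this is a one-liner; a minimal sketch, with the names `T` and `oplus` being my own labels:

```python
import numpy as np

T = np.array([[0, 1],
              [1, 0]])   # swaps the two components: T @ (y1, y2) = (y2, y1)

def oplus(x, y):
    # The operation above: ordinary vector addition of x with the swapped y.
    return x + T @ y

x = np.array([1, 2])
y = np.array([5, 3])
print(oplus(x, y))       # [4 7]  =  (x1 + y2, x2 + y1)
```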
It is worth noting here that the operation $y \oplus x$ is just $T(x\oplus y)$, since $T(T(x))=x$. Thus
$$ y \oplus x = \begin{bmatrix}y_1\\y_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0&1\\1&0\end{bmatrix} \left( \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix}+ \begin{bmatrix}x_1\\x_2\end{bmatrix} \right) = \begin{bmatrix}0&1\\1&0\end{bmatrix} \left( \begin{bmatrix}x_1\\x_2\end{bmatrix}+ \begin{bmatrix}0&1\\1&0\end{bmatrix} \begin{bmatrix}y_1\\y_2\end{bmatrix} \right) = T(x\oplus y) $$
Since $T(x\oplus y)$ equals $x\oplus y$ only in the special case $x_1+y_2 = x_2+y_1$, commutativity is lost in general, and so the addition you have defined does not yield a vector space.
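Continuing the snippet above, both the identity $y \oplus x = T(x \oplus y)$ and the failure of commutativity can be checked numerically:

```python
# Continuing the snippet above:
print(np.array_equal(oplus(y, x), T @ oplus(x, y)))  # True for every x, y
print(np.array_equal(oplus(x, y), oplus(y, x)))      # False: [4 7] vs [7 4]
```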
Further, associativity is lost (the two sides below agree only in the special case $z_1 = z_2$):
$$ x\oplus(y\oplus z) = \begin{bmatrix} x_1+y_2+z_1\\x_2+y_1+z_2 \end{bmatrix} \ne (x\oplus y)\oplus z = \begin{bmatrix} x_1+y_2+z_2\\x_2+y_1+z_1 \end{bmatrix} $$
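And the associativity failure, continuing the same snippet; any $z$ with $z_1 \ne z_2$ exhibits it:

```python
# Continuing the same snippet:
z = np.array([1, 4])
lhs = oplus(x, oplus(y, z))      # [5 11] = (x1 + y2 + z1, x2 + y1 + z2)
rhs = oplus(oplus(x, y), z)      # [8  8] = (x1 + y2 + z2, x2 + y1 + z1)
print(np.array_equal(lhs, rhs))  # False whenever z1 != z2
```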
I'd look into commutators if this is something you are interested in.