I have a question about the formalization of pseudovectors. Wikipedia (and my electromagnetism professor, and all the electromagnetism books) states only that a vector $v$ transforms as $v' = Rv$, while a pseudovector $u$ transforms as $u' = \mathrm{det}(R)Ru$ under a rotation $R$, but that doesn't seem to make sense to me as a definition. To me, a pseudovector seems to be just the image of a function $f:V \to V: v \mapsto f(v) = w$ with the property $f(Rv) = \mathrm{det}(R)Rf(v)$, and nothing more. In fact, pseudovectors seem to satisfy the axioms of a vector space (a pseudovector plus a pseudovector is a pseudovector, $0 \cdot u = 0$, etc.). This becomes even more confusing when I hear people say, especially in the context of electromagnetism, that a pseudovector can be thought of as an antisymmetric matrix. I don't understand how this relates to the initial definition, and the places where I see this discussed use far more advanced mathematics than I am able to understand.
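The transformation rule in question can be checked numerically. Below is a minimal sketch, assuming the standard identity $(Rv)\times(Rw) = \det(R)\,R(v\times w)$ for $R \in O(3)$; all variable names are illustrative:

```python
import numpy as np

# Check the pseudovector transformation rule for the cross product:
#   (Rv) x (Rw) = det(R) * R (v x w)
# For a proper rotation (det = +1) the extra factor is invisible;
# for a reflection (det = -1) the sign appears.

rng = np.random.default_rng(0)
v, w = rng.standard_normal(3), rng.standard_normal(3)

# A proper rotation about the z-axis.
t = 0.7
R_rot = np.array([[np.cos(t), -np.sin(t), 0],
                  [np.sin(t),  np.cos(t), 0],
                  [0,          0,         1]])

# A reflection: flip the x-axis, det = -1.
R_ref = np.diag([-1.0, 1.0, 1.0])

for R in (R_rot, R_ref):
    lhs = np.cross(R @ v, R @ w)
    rhs = np.linalg.det(R) * R @ np.cross(v, w)
    assert np.allclose(lhs, rhs)
```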
Pseudo-vector formal definition
131 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
This is not explained well anywhere that I know of. The formal mathematical difference between a vector and a pseudovector is, in my opinion, best explained in the language of representation theory, to say precisely what it means that a vector and a pseudovector do not transform in the same way under a change of coordinates.
First, let $V$ be a vector space. From $V$ we can construct various other vector spaces, such as the tensor products $V^{\otimes n}$, the symmetric and exterior powers, the symmetric and antisymmetric tensors, and so forth. Each of these acquires an action of the group $GL(V)$ of linear transformations $V \to V$, and so becomes a representation of that group. This group action is how a mathematician thinks about how tensors of various kinds transform under change of coordinates.
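The $GL(V)$ action on a tensor power is determined by the action on $V$ itself. A small numpy sketch of this for $V^{\otimes 2}$, identifying a pure tensor $v \otimes w$ with the matrix $vw^T$ (the names here are illustrative):

```python
import numpy as np

# The GL(V) action on V (x) V is inherited from the action on V:
# on pure tensors, A . (v (x) w) = (Av) (x) (Aw).  Under the
# identification v (x) w  <->  v w^T, this action reads T -> A T A^T.

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))        # a generic element of GL(3, R)
v, w = rng.standard_normal(3), rng.standard_normal(3)

T = np.outer(v, w)                     # v (x) w as a 3x3 matrix
acted = np.outer(A @ v, A @ w)         # act on each tensor factor

assert np.allclose(acted, A @ T @ A.T)
```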
Now, it is true that all of these objects are vector spaces, so their elements "are vectors" in that sense. But when we say that some object $X$ is a tensor and not a vector, we are working in a context where we fix a "base" vector space $V$ and construct other vector spaces out of it, and we mean that $X$ is an element of some other representation rather than an element of $V$. So in this context "vector" should be read as "the vector (defining) representation $V$," and when we say an object is not a vector in this context we mean it's an element of some other representation.
So, how do we understand pseudovectors in this context? You have probably heard that the cross product of two vectors is not an ordinary vector but a pseudovector.
How do we make sense of this? If $V$ is a vector space, there is a generalization of the cross product called the wedge product or exterior product, and if $v, w \in V$ the wedge product $v \wedge w$ takes values in the exterior square $\Lambda^2(V)$, which is a quotient of $V^{\otimes 2}$ and in general quite a different representation of $GL(V)$. If $V$ is a finite-dimensional real vector space equipped with an inner product, then we can identify $V$ with its dual $V^{\ast}$ (which is a non-isomorphic representation of $GL(V)$ but an isomorphic representation of the orthogonal group $O(V)$ with respect to this inner product). This identification lets us identify $V^{\otimes 2}$ with $V \otimes V^{\ast} \cong \text{End}(V)$, and if we pick an orthonormal basis and identify $V$ with $\mathbb{R}^n$ this identification turns out to identify the antisymmetric tensors in $V^{\otimes 2}$ with antisymmetric matrices in $\text{End}(V) \cong M_n(\mathbb{R})$. We can further always identify exterior powers with antisymmetric tensors.
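The identification of antisymmetric tensors with antisymmetric matrices, and their relation to the cross product, can be made concrete. A sketch assuming one standard sign convention for the "hat" map $\mathbb{R}^3 \to \mathfrak{so}(3)$ (the helper name `hat` is mine, not standard library code):

```python
import numpy as np

def hat(u):
    """One common convention for the map R^3 -> antisymmetric 3x3
    matrices; hat(u) @ x equals np.cross(u, x)."""
    return np.array([[0,    -u[2],  u[1]],
                     [u[2],  0,    -u[0]],
                     [-u[1], u[0],  0  ]])

rng = np.random.default_rng(2)
v, w = rng.standard_normal(3), rng.standard_normal(3)

# The antisymmetric part of v (x) w encodes exactly the cross
# product v x w (up to the sign convention chosen in `hat`):
M = np.outer(w, v) - np.outer(v, w)
assert np.allclose(M, hat(np.cross(v, w)))
```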
So, this is why we can identify $\Lambda^2(V)$ with antisymmetric matrices (if $V$ has an inner product). This is significant because the antisymmetric matrices $\mathfrak{o}(V) \cong \mathfrak{so}(V)$ are the Lie algebra of the orthogonal group $O(V)$; this says exactly that antisymmetric matrices are what you get when you differentiate rotations, and that's what angular velocities are! So, that's why cross products = elements of the exterior square = angular velocities.
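The claim that antisymmetric matrices are what you get when you differentiate rotations can be checked directly. A minimal sketch, using an illustrative one-parameter family of rotations `Rz` and a finite difference:

```python
import numpy as np

def Rz(t):
    """Rotation by angle t about the z-axis."""
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0,          0,         1]])

# Differentiating a curve of rotations at the identity gives an
# antisymmetric matrix -- an element of the Lie algebra so(3):
h = 1e-6
Omega = (Rz(h) - Rz(-h)) / (2 * h)     # central finite difference

assert np.allclose(Omega, -Omega.T, atol=1e-6)   # antisymmetric
# For this curve it is the angular velocity of rotation about e_z:
assert np.allclose(Omega, np.array([[0, -1, 0],
                                    [1,  0, 0],
                                    [0,  0, 0]]), atol=1e-6)
```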
Okay, but we're not done: a pseudovector is supposed to be something that looks almost but not quite like a vector. How do we get one of those? Now we need to assume, in addition to assuming that $V$ is a real inner product space, that $V$ is exactly $3$-dimensional. In that case there is another wedge product operation
$$\wedge : V \otimes \Lambda^2(V) \to \Lambda^3(V)$$
where $\Lambda^3(V)$ is $1$-dimensional; as a representation of $GL(V)$ or $O(V)$ it sends a linear transformation $T$ to scalar multiplication by its determinant $\det(T)$, and so we might also use the notation $\det(V)$ for this representation. This wedge product is a nondegenerate bilinear map and so it gives us an isomorphism
$$\Lambda^2(V) \cong V^{\ast} \otimes \det(V)$$
of $GL(V)$-representations. Further applying the isomorphism $V^{\ast} \cong V$ induced by an inner product, we get an isomorphism
$$\Lambda^2(V) \cong V \otimes \det(V)$$
of $O(V)$-representations. This is a slight variation of the Hodge star where we don't fix an orientation.
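The extra $\det(V)$ factor in this identification can be seen concretely: transporting the conjugation action of $O(3)$ on antisymmetric matrices through the usual "hat" map gives a vector that transforms with an extra $\det(R)$. A numpy sketch of the identity $R\,\widehat{u}\,R^T = \det(R)\,\widehat{Ru}$ (the helper name `hat` is illustrative):

```python
import numpy as np

def hat(u):
    """R^3 -> antisymmetric matrices, realizing Lambda^2(V) concretely."""
    return np.array([[0,    -u[2],  u[1]],
                     [u[2],  0,    -u[0]],
                     [-u[1], u[0],  0  ]])

# O(3) acts on antisymmetric matrices by conjugation, A -> R A R^T.
# Pulling this back through `hat` shows the identified vector picks
# up a factor of det(R) -- i.e. it is a pseudovector:
#     R hat(u) R^T = det(R) * hat(R u)

rng = np.random.default_rng(3)
u = rng.standard_normal(3)

t = 1.1
R_rot = np.array([[np.cos(t), -np.sin(t), 0],
                  [np.sin(t),  np.cos(t), 0],
                  [0,          0,         1]])   # det = +1
R_ref = np.diag([-1.0, 1.0, 1.0])                # det = -1

for R in (R_rot, R_ref):
    assert np.allclose(R @ hat(u) @ R.T,
                       np.linalg.det(R) * hat(R @ u))
```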
This is quite a lot of abstract machinery, but in the end this is how we get something that looks almost like a vector (in the sense that this representation is almost $V$) but picks up an extra sign when it's reflected (which is what that $\det(V)$ does); it is where cross products take values, and it is the sort of thing that an angular velocity is. It would be a somewhat tedious exercise, but it would be possible in principle to explicitly chase through all of these isomorphisms using an orthonormal basis and see what happens when you change basis to another orthonormal basis, to see very explicitly the difference between each of these representations as well as what all of these natural isomorphisms are doing.
A philosophical point is that the $3$-dimensional case is actually quite confusing because there are a bunch of $3$-dimensional vector spaces running around, so it's easy to assume they're all the same $3$-dimensional vector space; but they don't all transform the same way under change of coordinates. In $n$ dimensions $\Lambda^2(V)$ has dimension ${n \choose 2}$, so in general it's a lot harder to confuse with $V$; e.g. an angular velocity does not look like a single vector in higher dimensions but really has to be thought of as an antisymmetric matrix.
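The dimension count above is easy to verify; a small sketch using only Python's standard library:

```python
from math import comb

# In n dimensions an angular velocity lives in Lambda^2(V), whose
# dimension is n(n-1)/2: the number of independent entries of an
# antisymmetric n x n matrix (its strictly upper-triangular slots).
for n in range(2, 7):
    upper = sum(1 for i in range(n) for j in range(i + 1, n))
    assert comb(n, 2) == upper

assert comb(3, 2) == 3   # only for n = 3 does dim Lambda^2(V) = dim V
assert comb(4, 2) == 6   # already in 4 dimensions they differ
```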