Understanding pseudovectors and pseudoscalars


In doing an exercise about pseudovectors

How do the components of a cross product transform under inversion?

I got stuck. I didn't know how to compute the inversion of a cross product.

My naive approach was as follows. Say we invert the cross product $\mathbf a = \mathbf b \times \mathbf c$ about the origin. We then have $\mathrm{inv} (\mathbf a) = \mathrm{inv} (a_x,a_y,a_z)=(-a_x,-a_y,-a_z)$, and we're done.

This means $\mathrm{inv}(\mathbf a) = (-)\mathbf a = - \mathbf b \times \mathbf c$.

At this point I have no means of checking whether this is correct or not.

Hence, I now choose $\mathbf b = \hat{\mathbf{x}}$ and $\mathbf c =\hat{\mathbf{y}}$, so that $\mathbf a = \hat{\mathbf z}$.

Now, $\mathrm{inv}(\hat{\mathbf z}) = -\hat{\mathbf z} = - \hat{\mathbf x} \times \hat{\mathbf y} $.

This I can check, using the right-hand rule.

$ - \hat{\mathbf x} \times \hat{\mathbf y} = \hat{\mathbf y} \times \hat{\mathbf x} = - \hat{\mathbf z} .$

Hence, this seems to check out, in keeping with my naive calculation. Yet apparently it is wrong. Why?
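The disagreement between the two prescriptions can be seen numerically. Here is a quick sketch with NumPy (the unit vectors are just the choices made above; nothing here is part of the original argument):

```python
import numpy as np

b = np.array([1.0, 0.0, 0.0])  # x-hat
c = np.array([0.0, 1.0, 0.0])  # y-hat
a = np.cross(b, c)             # equals z-hat by the right-hand rule

# Naive prescription: invert the components of the result directly.
naive = -a

# Prescription from the linked answer: invert b and c first, then cross.
rule = np.cross(-b, -c)
```

Running this shows `naive` is $-\hat{\mathbf z}$ while `rule` is $+\hat{\mathbf z}$, so the two prescriptions really do give opposite answers.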

Reading this question and its answer, I seem to have to calculate the inversion of $\mathbf a$ about the origin by inverting its components separately: $\mathrm{inv}(\mathbf a) = \mathrm{inv}(\mathbf b) \times \mathrm{inv}(\mathbf c)$. How could I have derived that you should invert the components $\mathbf b, \mathbf c$ separately?

In the answer, the argument goes that we simply define the action of the linear operator $\mathrm{inv}$ on a bivector (?) that way. But how does this rule, $\mathrm{inv}(\mathbf a) = \mathrm{inv}(\mathbf b) \times \mathrm{inv}(\mathbf c)$, follow naturally?

Also, the wedge product used to build bivectors is associative, but the cross product isn't. How is this relevant to the discussion?

If I accept this rule, then the answer to my question is readily given: we simply have $$\mathrm{inv}(\mathbf b) \times \mathrm{inv}(\mathbf c) = (-\mathbf b) \times (-\mathbf c) = \mathbf a$$ hence the cross product is invariant under inversions.
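The claimed invariance holds for any pair of vectors, since the two minus signs cancel. A minimal numerical sketch (the vectors are arbitrary, hypothetical choices):

```python
import numpy as np

# Arbitrary test vectors.
b = np.array([1.0, -2.0, 0.5])
c = np.array([0.3, 4.0, -1.0])

# Inverting the factors separately leaves the cross product unchanged:
lhs = np.cross(-b, -c)  # cross product of the inverted factors
rhs = np.cross(b, c)    # original cross product
```

`lhs` and `rhs` agree, as expected from $(-\mathbf b) \times (-\mathbf c) = \mathbf b \times \mathbf c$.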

My post relates directly to this question.

An answer to that question attempts to address mine by saying that the laws of physics are conserved under coordinate inversion, which I find unilluminating. I read the same sentence

One of the most powerful ideas in physics is that physical laws do not change when one changes the coordinate system used to describe these laws.

in the Wikipedia article on pseudoscalars.

By using the same rule, inverting the components separately, on the scalar triple product, the answer follows readily: $$\mathrm{inv}(\mathbf d \cdot \mathbf e \times \mathbf f) = \mathrm{inv}(\mathbf d) \cdot \mathrm{inv}(\mathbf e) \times \mathrm{inv}(\mathbf f) = - \mathbf d \cdot \mathbf e \times \mathbf f \ .$$
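With three factors the minus signs no longer cancel, so the triple product picks up an overall sign. A quick check with arbitrary (hypothetical) vectors:

```python
import numpy as np

d = np.array([1.0, 2.0, 3.0])
e = np.array([-1.0, 0.5, 2.0])
f = np.array([0.0, 1.0, -1.0])

triple = np.dot(d, np.cross(e, f))            # d . (e x f)
inverted = np.dot(-d, np.cross(-e, -f))       # inverted components: three sign flips
```

Here `inverted` equals `-triple`: an odd number of sign flips survives, which is exactly the pseudoscalar behavior.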

My same question in this case: how does this rule of inverting the separate components follow naturally?

I want to deeply understand what relates pseudovectors and pseudoscalars to bivectors and trivectors respectively, in a clear way. This answer attempts to relate them to the exterior and Clifford algebras, which I don't follow. It says that inversion changes the handedness of a trivector:

Trivectors, too, are different from ordinary scalars. All trivectors can be written as a scalar multiple of $\epsilon \equiv \hat x \wedge \hat y \wedge \hat z$, and we can interpret this as an oriented volume. If some trivector $\tau = \alpha \epsilon$ for some positive $\alpha$, then $\tau$ is right-handed. If $\alpha < 0$, then $\tau$ is left-handed.
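One concrete reading of "handedness" (my sketch, not taken from the quoted answer): the oriented volume of an ordered triple of vectors is the determinant of the matrix having those vectors as rows, and inversion flips its sign, turning a right-handed frame into a left-handed one:

```python
import numpy as np

# Standard right-handed basis (x-hat, y-hat, z-hat as rows): oriented volume +1.
basis = np.eye(3)
vol = np.linalg.det(basis)

# Inverting all three basis vectors flips the sign of the oriented volume,
# because det(-M) = (-1)^3 det(M) in three dimensions.
inv_vol = np.linalg.det(-basis)
```

The inverted frame has oriented volume $-1$: in the language of the quoted answer, its trivector is $\alpha\epsilon$ with $\alpha < 0$, i.e. left-handed.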

What do we mean by changing the handedness?

I decided to ask this question on math.stackexchange instead of physics.stackexchange because it relates to the exterior and Clifford algebras. I also haven't used any physics applications in the post.

Answer:

You should imagine that when you take $\mathbf x, \mathbf y \in \mathbb R^3$ and compute $\mathbf x \times \mathbf y$, the result lives in a "different $\mathbb R^3$". We call elements of the first $\mathbb R^3$ "vectors", and elements of the second $\mathbb R^3$ "pseudovectors". It is legal to add two vectors together; it is legal to add two pseudovectors together; it is illegal to add a vector and a pseudovector.

So the question is:

If we reflect the first $\mathbb R^3$ through the origin, what happens to the second $\mathbb R^3$?

We have a transformation $\text{inv}$ that takes a vector $(x,y,z)$ to a vector $\text{inv}(x,y,z) = (-x,-y,-z)$. We also have a corresponding transformation, which you're also calling $\text{inv}$, but which I'll call for clarity $\text{inv}^{\times}$. It operates on the second $\mathbb R^3$, where pseudovectors live.

We don't know what it does yet, but we'd like it to do something that plays nicely with the cross product: we'd like to have $$ \text{inv}^\times(\mathbf x \times \mathbf y) = \text{inv}(\mathbf x) \times \text{inv}(\mathbf y). $$

Now, if we apply the definition of the cross product, we get $$ \text{inv}(\mathbf x) \times \text{inv}(\mathbf y) = \det\begin{bmatrix} \mathbf i & \mathbf j & \mathbf k \\ -x_1 & -x_2 & -x_3 \\ -y_1 & -y_2 & -y_3\end{bmatrix} = (-1) \det\begin{bmatrix} \mathbf i & \mathbf j & \mathbf k \\ x_1 & x_2 & x_3 \\ -y_1 & -y_2 & -y_3\end{bmatrix} = \\ = (-1)^2 \det\begin{bmatrix} \mathbf i & \mathbf j & \mathbf k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3\end{bmatrix} = \mathbf x \times \mathbf y. $$

We conclude that $\text{inv}^\times$ must be the identity: $\text{inv}^\times(\mathbf z) = \mathbf z$ for all $\mathbf z$ in the "second $\mathbb R^3$".
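The inversion computed above is a special case of the general identity $R\mathbf x \times R\mathbf y = \det(R)\, R(\mathbf x \times \mathbf y)$ for orthogonal $R$, which is what makes the cross product a pseudovector. A numerical sketch (the vectors and the particular orthogonal matrix are arbitrary choices of mine):

```python
import numpy as np

x = np.array([1.0, 2.0, -0.5])
y = np.array([0.3, -1.0, 2.0])

# An improper orthogonal transformation: a rotation about z composed
# with the inversion, so det(R) = -1.
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
R = -Rz

# Pseudovector transformation law: the extra det(R) factor is the
# difference between a vector and a pseudovector.
lhs = np.cross(R @ x, R @ y)
rhs = np.linalg.det(R) * (R @ np.cross(x, y))
```

For $R = -I$ this reduces to the computation in the answer: $\det(R) = -1$ and $R(\mathbf x \times \mathbf y) = -(\mathbf x \times \mathbf y)$, so the two signs cancel and the cross product is unchanged.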