Deriving geometric product of basis vectors from outer product


I was exploring geometric algebra in general, when I came across these two formulas relating the inner and outer products of two objects with their geometric products:

$$\begin{align} \mathbf{u} \cdot \mathbf{v} &= \frac{1}{2}(\mathbf{uv} + \mathbf{vu}) \\ \mathbf{u} \wedge \mathbf{v} &= \frac{1}{2}(\mathbf{uv} - \mathbf{vu}) \\ \end{align}$$

I also learned that the geometric product of orthogonal vectors is anticommutative (so, in particular, the basis vectors of an orthogonal basis anticommute):

$$\begin{align} \mathbf{\hat{e}_1}\mathbf{\hat{e}_2} &= -\mathbf{\hat{e}_2}\mathbf{\hat{e}_1} \\ \mathbf{\hat{e}_1}\mathbf{\hat{e}_3} &= -\mathbf{\hat{e}_3}\mathbf{\hat{e}_1} \\ \mathbf{\hat{e}_2}\mathbf{\hat{e}_3} &= -\mathbf{\hat{e}_3}\mathbf{\hat{e}_2} \\ \end{align}$$
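These relations are easy to sanity-check numerically. One well-known faithful representation of the geometric algebra of 3D Euclidean space sends the basis vectors to the Pauli matrices, with the geometric product becoming ordinary matrix multiplication. A minimal sketch (pure Python, no libraries; `mul` and `neg` are just illustrative helpers):

```python
# Sketch: the Pauli matrices represent the orthonormal basis vectors of
# 3D Euclidean geometric algebra; matrix product = geometric product.
i = 1j

e1 = ((0, 1), (1, 0))    # sigma_x
e2 = ((0, -i), (i, 0))   # sigma_y
e3 = ((1, 0), (0, -1))   # sigma_z

def mul(a, b):
    """Product of two 2x2 matrices (tuples of row tuples)."""
    return tuple(
        tuple(sum(a[r][k] * b[k][c] for k in range(2)) for c in range(2))
        for r in range(2)
    )

def neg(a):
    """Negate every entry of a 2x2 matrix."""
    return tuple(tuple(-x for x in row) for row in a)

identity = ((1, 0), (0, 1))

# Orthogonal basis vectors anticommute under the geometric product ...
assert mul(e1, e2) == neg(mul(e2, e1))
assert mul(e1, e3) == neg(mul(e3, e1))
assert mul(e2, e3) == neg(mul(e3, e2))
# ... while each unit vector squares to 1.
assert all(mul(e, e) == identity for e in (e1, e2, e3))
```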

As part of my exploration, I tried to derive $\mathbf{\hat{e}_1}\wedge\mathbf{\hat{e}_2}\wedge\mathbf{\hat{e}_3} = \mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\mathbf{\hat{e}_3}$ by working forwards using the definition of the outer product. However, I ended up proving that $\mathbf{\hat{e}_1}\wedge\mathbf{\hat{e}_2}\wedge\mathbf{\hat{e}_3} = 0$ ...

$$\begin{align} &\ \mathbf{\hat{e}_1}\wedge\mathbf{\hat{e}_2}\wedge\mathbf{\hat{e}_3} \\ &= \frac{1}{2}(\mathbf{\hat{e}_1}\mathbf{\hat{e}_2} - \mathbf{\hat{e}_2}\mathbf{\hat{e}_1})\wedge\mathbf{\hat{e}_3} \\ &= \frac{1}{2}(\mathbf{\hat{e}_1}\mathbf{\hat{e}_2} + \mathbf{\hat{e}_1}\mathbf{\hat{e}_2})\wedge\mathbf{\hat{e}_3} \\ &= \mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\wedge\mathbf{\hat{e}_3} \\ &= \frac{1}{2}(\mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\mathbf{\hat{e}_3} - \mathbf{\hat{e}_3}\mathbf{\hat{e}_1}\mathbf{\hat{e}_2}) \\ &= \frac{1}{2}(\mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\mathbf{\hat{e}_3} + \mathbf{\hat{e}_1}\mathbf{\hat{e}_3}\mathbf{\hat{e}_2}) \\ &= \frac{1}{2}(\mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\mathbf{\hat{e}_3} - \mathbf{\hat{e}_1}\mathbf{\hat{e}_2}\mathbf{\hat{e}_3}) \\ &= 0\ \text{(subtraction cancels to zero)} \\ \end{align}$$

Where am I going wrong?


Best answer:

A more fundamental approach is to use the grade selection operators and the contraction axiom $ \mathbf{x}^2 = \mathbf{x} \cdot \mathbf{x} $ to define the wedge product, to demonstrate the anticommutativity property, and to prove the symmetric-product identity for the dot product.

The anticommutative property follows from the contraction and distributivity axioms. If $ \mathbf{a} \cdot \mathbf{b} = 0 $, then we have $$\begin{aligned} 0 &= \left( { \mathbf{a} + \mathbf{b} } \right)^2 - \left( { \mathbf{a} + \mathbf{b} } \right) \cdot \left( { \mathbf{a} + \mathbf{b} } \right) \\ &= \mathbf{a}^2 + \mathbf{b}^2 + \mathbf{a} \mathbf{b} + \mathbf{b} \mathbf{a} - \mathbf{a} \cdot \mathbf{a} - \mathbf{b} \cdot \mathbf{b} - \mathbf{a} \cdot \mathbf{b} - \mathbf{b} \cdot \mathbf{a} \\ &= \mathbf{a} \mathbf{b} + \mathbf{b} \mathbf{a},\end{aligned}$$ so for any orthogonal vectors $ \mathbf{a}, \mathbf{b} $, we have $$\mathbf{a} \mathbf{b} = -\mathbf{b} \mathbf{a}.$$

Now suppose that we have a Euclidean orthonormal basis $ \left\{ {\mathbf{e}_1, \mathbf{e}_2, \cdots} \right\} $. Let $ \mathbf{u} = \sum_k \mathbf{e}_k u_k $ and $ \mathbf{v} = \sum_k \mathbf{e}_k v_k $; then $$\begin{aligned} \mathbf{u} \mathbf{v} &= \sum_{j,k} \mathbf{e}_j \mathbf{e}_k u_j v_k \\ &= \sum_{j = k} \mathbf{e}_j \mathbf{e}_k u_j v_k + \sum_{j \ne k} \mathbf{e}_j \mathbf{e}_k u_j v_k \\ &= \sum_{j} (\mathbf{e}_j \cdot \mathbf{e}_j) u_j v_j + \sum_{j < k} \mathbf{e}_j \mathbf{e}_k \left( { u_j v_k - u_k v_j } \right).\end{aligned}$$ The scalar part of this sum is completely symmetric, whereas the bivector part is completely antisymmetric (this is a general statement, and can also be shown easily for non-Euclidean bases). We must then have $$ \left\langle{{\mathbf{u} \mathbf{v}}}\right\rangle = \frac{1}{{2}} \left( { \mathbf{u} \mathbf{v} + \mathbf{v} \mathbf{u} } \right),$$ and $$ {\left\langle{{\mathbf{u} \mathbf{v}}}\right\rangle}_{2} = \frac{1}{{2}} \left( { \mathbf{u} \mathbf{v} - \mathbf{v} \mathbf{u} } \right).$$ It is also clear that the scalar part of this coordinate expansion is the dot product of the two vectors, which means $$ \mathbf{u} \cdot \mathbf{v} = \frac{1}{{2}} \left( { \mathbf{u} \mathbf{v} + \mathbf{v} \mathbf{u} } \right).$$ Now we define the wedge of two vectors as $$ \mathbf{u} \wedge \mathbf{v} = {\left\langle{{ \mathbf{u} \mathbf{v} }}\right\rangle}_{2},$$ from which we see $$ \mathbf{u} \wedge \mathbf{v} = \frac{1}{{2}} \left( { \mathbf{u} \mathbf{v} - \mathbf{v} \mathbf{u} } \right).$$

We define the wedge of three vectors as $$ \mathbf{u} \wedge \mathbf{v} \wedge \mathbf{w} = {\left\langle{{ \mathbf{u} \mathbf{v} \mathbf{w} }}\right\rangle}_{3}.$$ More generally, we define the dot and wedge of a vector $ \mathbf{u} $ and a k-blade (the wedge of k vectors) $ V_k $ as $$\mathbf{u} \cdot V_k = {\left\langle{{ \mathbf{u} V_k }}\right\rangle}_{{k-1}},$$ and $$\mathbf{u} \wedge V_k = {\left\langle{{ \mathbf{u} V_k }}\right\rangle}_{{k+1}}.$$

and, still more generally, for a j-blade $ U_j $ and a k-blade $V_k$: $$U_j \cdot V_k = {\left\langle{{ U_j V_k }}\right\rangle}_{\left\lvert{k-j}\right\rvert},$$ and $$U_j \wedge V_k = {\left\langle{{ U_j V_k }}\right\rangle}_{{k+j}}.$$
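The symmetric/antisymmetric split used in this answer can be verified numerically. Below is a minimal sketch of the Euclidean geometric product on an orthonormal basis, with blades represented as sorted index tuples; all names (`blade_mul`, `gp`, `grade`) are illustrative, not any particular library's API:

```python
import random
from itertools import product as pairs

def blade_mul(a, b):
    """Geometric product of two orthonormal basis blades.

    Blades are tuples of basis indices: () is the scalar 1, (1, 2) is
    e1 e2, etc.  Returns (sign, blade), using e_i e_j = -e_j e_i for
    i != j and e_i e_i = 1 (Euclidean metric).
    """
    arr = list(a) + list(b)
    sign = 1
    swapped = True
    while swapped:                  # bubble sort, flipping the sign
        swapped = False             # once per adjacent transposition
        for i in range(len(arr) - 1):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                sign, swapped = -sign, True
    out, i = [], 0
    while i < len(arr):             # contract repeated indices:
        if i + 1 < len(arr) and arr[i] == arr[i + 1]:
            i += 2                  # e_i e_i = 1 drops out
        else:
            out.append(arr[i])
            i += 1
    return sign, tuple(out)

def gp(u, v):
    """Geometric product of multivectors (dicts mapping blade -> coeff)."""
    out = {}
    for (ba, ca), (bb, cb) in pairs(u.items(), v.items()):
        s, blade = blade_mul(ba, bb)
        out[blade] = out.get(blade, 0.0) + s * ca * cb
    return out

def grade(m, k):
    """Grade-k part of a multivector."""
    return {b: c for b, c in m.items() if len(b) == k}

random.seed(0)
u = {(k,): random.uniform(-1, 1) for k in (1, 2, 3)}
v = {(k,): random.uniform(-1, 1) for k in (1, 2, 3)}
uv, vu = gp(u, v), gp(v, u)

# <uv>_0 is the dot product (the symmetric part) ...
dot = sum(u[(k,)] * v[(k,)] for k in (1, 2, 3))
assert abs(grade(uv, 0).get((), 0.0) - dot) < 1e-12
# ... and <uv>_2 equals (uv - vu)/2 (the antisymmetric part).
for b, c in grade(uv, 2).items():
    assert abs(c - (uv[b] - vu.get(b, 0.0)) / 2) < 1e-12

# The question's trivector: <e1 e2 e3>_3 really is e1 e2 e3.
e1, e2, e3 = {(1,): 1.0}, {(2,): 1.0}, {(3,): 1.0}
assert grade(gp(gp(e1, e2), e3), 3) == {(1, 2, 3): 1.0}
```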

Another answer:

The mistake is in passing from the fourth line to the fifth.

You are applying your identity to $u=e_1e_2$ and $v=e_3$, but $e_1e_2$ is not a vector.

The identity applies only to vectors $u,v$.
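In fact, for a vector $a$ and a bivector $B$, the symmetric and antisymmetric parts trade places (a standard geometric-algebra identity, worth noting here):

$$ a\cdot B=\frac12(aB-Ba),\qquad a\wedge B=\frac12(aB+Ba) $$

Since $e_3$ commutes with $e_1e_2$ (it is orthogonal to both factors), the antisymmetric combination you formed is $(e_1e_2)\cdot e_3=0$, which is exactly the zero you found, while $(e_1e_2)\wedge e_3=\frac12(e_1e_2e_3+e_3e_1e_2)=e_1e_2e_3$ as expected.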

The relevant identity in your case is that

$$ a\wedge b=\frac12(ab-ba) $$

The correct extension of that is the generalized antisymmetrized product

$$ \bigwedge_{i=1}^n a_i=\frac{1}{n!}\sum_{\sigma\in Sym(n)}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{\sigma(i)} $$

In your case, $$\begin{aligned} e_1\wedge e_2\wedge e_3 &=\frac16(e_1e_2e_3+e_2e_3e_1+e_3e_1e_2-e_1e_3e_2-e_2e_1e_3-e_3e_2e_1)\\ &=\frac16(6\,e_1e_2e_3)=e_1e_2e_3 \end{aligned}$$
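This cancellation pattern can be checked mechanically: every product $e_{\sigma(1)}e_{\sigma(2)}e_{\sigma(3)}$ reduces to $\operatorname{sgn}(\sigma)\,e_1e_2e_3$ via adjacent swaps, so each signed term in the sum contributes $+e_1e_2e_3$. A pure-Python sketch, computing the two signs by independent routes (`sgn` and `reduce_word` are illustrative helper names):

```python
from itertools import permutations

def sgn(p):
    """Sign of a permutation, via inversion counting."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def reduce_word(p):
    """Sort the word e_{p(1)} ... e_{p(n)} into e_1 ... e_n by adjacent
    swaps, flipping the sign once per swap (e_i e_j = -e_j e_i, i != j).
    Returns the accumulated sign."""
    w, s = list(p), 1
    done = False
    while not done:
        done = True
        for i in range(len(w) - 1):
            if w[i] > w[i + 1]:
                w[i], w[i + 1] = w[i + 1], w[i]
                s, done = -s, False
    return s

# Each term sgn(s) * e_{s(1)} e_{s(2)} e_{s(3)} contributes
# sgn(s) * reduce_word(s) = +1 times e1 e2 e3.
total = sum(sgn(p) * reduce_word(p) for p in permutations((1, 2, 3)))
assert total / 6 == 1.0   # the 1/3! normalization recovers e1 e2 e3 exactly
```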