A question on the dual relationship between the regressive product and the exterior product

I am trying to understand the following sentence, which I came across in a book:

The underlying beauty of the Ausdehnungslehre is due to this symmetry [the duality between the regressive and exterior product], which in turn is due to the fact that linear spaces of $m$-elements and linear spaces of $(n-m)$-elements have the same dimension...

According to the book, this is related to the dual nature of the regressive and exterior products:

For example, the exterior product of $m$ 1-elements is an $m$-element. The dual to this is that the regressive product of $m$ $(n-1)$-elements is an $(n-m)$-element.

Here is exactly what confuses me about the above:

1) How do linear spaces of $m$-elements and linear spaces of $(n-m)$-elements have the same dimension? Actually, what is meant by dimension here? I understand dimension to be the number of $m$-elements necessary to build an object of a particular space - i.e. the grade of the object. Is this correct? With this view, $(n-m) \neq m$.

2) I don't see how the fact that the regressive product of $m$ $(n-1)$-elements is an $(n-m)$-element makes it the dual of the exterior product...

3) How is duality akin to symmetry? I am unable to appreciate the beauty in the duality between two operators.

There are 2 answers below.

BEST ANSWER

I'm just going to answer this in the language of geometric algebra (because I can't seem to translate it into Grassmann-algebra terms correctly).


A $k$-blade is the exterior product of $k$ linearly independent vectors. In geometric algebra, the dual $A^*$ of a $k$-blade $A$ is the $(n-k)$-blade such that $A \wedge A^* = |A|^2I$ (we're not particularly interested in the constant $|A|^2$ here), where $I$ is called the unit pseudoscalar -- the highest-grade element of our $n$-space. Thus we can think of this "dual" as the part of $I$ which doesn't "contain" any of $A$.

The "regressive product" seems to be what I've heard called the "meet" of blades. The meet $M=A \vee B$ (which, if you learn geometric algebra, has the simple formula $A \vee B = A^* \cdot B$) is the blade of largest grade such that $A=M \wedge A'$ and $B=B' \wedge M$. Thus the regressive product is the part "contained" by both $A$ and $B$.

You'll want to note that in $\Bbb G^n$ there are $\binom{n}{k}$ orthogonal unit $k$-blades. Thus each $k$-space (the subspace of elements of grade $k$) has the same dimension as the $(n-k)$-space, because $\binom{n}{k} = \binom{n}{n-k}$. Dimension here means the same thing it does in linear algebra: the maximum number of linearly independent "basis blades" a $k$-space can hold -- this is distinct from the grade of a $k$-space/$k$-blade. For example, take our space to be $\Bbb G^4$. This is the space of scalars, vectors, 2-blades, 3-blades, 4-blades, and sums of these objects. In this space there are $\binom 42=6$ basis 2-blades -- usually called $e_1 \wedge e_2, e_1 \wedge e_3, e_1 \wedge e_4, e_2 \wedge e_3, e_2 \wedge e_4,$ and $e_3 \wedge e_4$ -- and thus the dimension of the subspace spanned by these basis blades is $6$. However, each element of this subspace has grade $2$.
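This dimension count is easy to check concretely. Here's a minimal sketch (my own, not from the book): a basis $k$-blade corresponds to a $k$-element index subset, so enumerating subsets enumerates basis blades.

```python
from itertools import combinations
from math import comb

n = 4
# A basis k-blade of an n-dimensional space corresponds to a k-element
# subset of {1, ..., n}, e.g. (1, 3) <-> e1 ^ e3.
for k in range(n + 1):
    blades = list(combinations(range(1, n + 1), k))
    assert len(blades) == comb(n, k)       # dimension of the k-space
    assert comb(n, k) == comb(n, n - k)    # equals the (n-k)-space dimension
    print(k, len(blades))
```

For $k = 2$ this enumerates exactly the six basis 2-blades named above.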

The fact that each $k$-space then has the same dimension as $(n-k)$-space is really the important part for understanding what your book is saying:

What your book is saying is that $ (\wedge^m\ V)^* = \vee^m\ V^*$ and (since the dual of the dual of an element is that same element, up to sign) likewise $(\vee^m\ V)^* = \wedge^m\ V^*$. That is, $(A \wedge B)^*=A^* \vee B^*$. So the smallest blade "containing" $A$ and $B$ is dual to the largest blade "contained" by both the dual of $A$ and the dual of $B$.

Because in geometric algebra we have a very simple formula for finding the dual of a blade (for reference it's $A^* = AI^{-1}$), we could theoretically represent any exterior product by regressive products (AKA meets) and vice versa. I think this rather unexpected and potentially powerful result is what this author is calling the "beauty" of this relation.


I don't know if any of this has made any sense to you, so just comment below if you still have questions.

SECOND ANSWER

I'm currently also studying Grassmann algebra. I'm a graphics & game programmer, not a mathematician, so I apologize in advance for any wrong terminology and other mistakes.

The way I see it:

  • "dimension" is the size of a minimal set of vectors that spans the entire vectorspace. Such a set is called a basis. Spanning means we can build each vector in the vectorspace from a linear combination of the basis vectors; for a basis, that combination is unique.

  • so if we have an N-dimensional vectorspace, we need exactly N vectors to span the space.

To explain the duality between the exterior and regressive product, consider the following example:

Suppose we have a 4D linear space V spanned by basis vectors {e1,e2,e3,e4}. This means that every element in V can be uniquely written as a1*e1 + a2*e2 + a3*e3 + a4*e4 for scalars {a1,a2,a3,a4}.
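To make "unique linear combination" concrete, here is a small numpy sketch (my own toy example; the matrix columns stand in for the abstract basis {e1,e2,e3,e4}):

```python
import numpy as np

# Columns of E play the role of the basis vectors e1..e4.
# Any invertible 4x4 matrix gives a valid basis of the 4D space.
E = np.array([[1., 1., 0., 0.],
              [0., 1., 1., 0.],
              [0., 0., 1., 1.],
              [0., 0., 0., 1.]])

v = np.array([2., -1., 0.5, 3.])   # an arbitrary element of V
a = np.linalg.solve(E, v)          # the unique coefficients a1..a4
assert np.allclose(E @ a, v)       # a1*e1 + a2*e2 + a3*e3 + a4*e4 == v
```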

When we build a Grassmann algebra ˄(V) on V, we get several subspaces by combining the vectors using the exterior (aka wedge, ˄) product:

  • the sub-space of 0-elements (scalars, blades of grade 0) has dimension 1: the basis {1} spans all other scalars.

  • the sub-space of 1-elements (vectors, blades of grade 1) has dimension 4: the basis is {e1,e2,e3,e4}, the same one as the 4D vectorspace we started with.

  • the sub-space of 2-elements (bi-vectors, blades of grade 2) has dimension 6: the basis is {e1˄e2, e1˄e3, e1˄e4, e2˄e3, e2˄e4, e3˄e4} := {e12, e13, e14, e23, e24, e34}. Every bivector can be built from a linear combination of bivectors in this basis.

  • the sub-space of 3-elements (tri-vectors, blades of grade 3, aka 4D anti-vectors) has dimension 4: the basis is {e1˄e2˄e3, e1˄e2˄e4, e1˄e3˄e4, e2˄e3˄e4} := {e123, e124, e134, e234}.

  • the sub-space of 4-elements (quad-vectors, blades of maximum grade 4, aka 4D pseudo-scalars or anti-scalars) has dimension 1: the basis is {e1˄e2˄e3˄e4} := {e1234} := {I}.

Notice the dimensions of these sub-spaces: 1 4 6 4 1

In general, for an N-dimensional vector-space, the dimensions of the sub-spaces formed by exterior products of vectors are found in row N of Pascal's triangle (counting from row 0), or can be computed using the binomial coefficients.
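In code (a minimal sketch using Python's math.comb, my own illustration):

```python
from math import comb

def grade_dims(n):
    """Dimensions of the grade-k subspaces of the Grassmann algebra over an
    n-dimensional vector space: row n of Pascal's triangle."""
    return [comb(n, k) for k in range(n + 1)]

print(grade_dims(4))   # [1, 4, 6, 4, 1]
# The symmetry comb(n, k) == comb(n, n - k) is exactly the duality
# between k-elements and (n-k)-elements discussed here.
```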

We clearly see a symmetry here! Instead of defining an exterior product that works on vectors, we can just as well define a regressive product that works on anti-vectors.

We first define a complementary "anti-basis" {ē1,ē2,ē3,ē4} (note that we are not talking about orthogonality here, we need no metric at all).

We define the complements ~ei = ēi in such a way that ei ˄ ēi = e1˄e2˄e3˄e4 = I:

  • ē1 = +e2˄e3˄e4 because e1 ˄ (ē1) = e1˄(+e2˄e3˄e4) = e1˄e2˄e3˄e4 (no swaps needed)
  • ē2 = -e1˄e3˄e4 because e2 ˄ (ē2) = e2˄(-e1˄e3˄e4) = e1˄e2˄e3˄e4 (1 swap e2 <-> e1 needed)
  • ē3 = +e1˄e2˄e4 because e3 ˄ (ē3) = e3˄(+e1˄e2˄e4) = e1˄e2˄e3˄e4 (2 swaps e3 <-> e1 <-> e2 needed)
  • ē4 = -e1˄e2˄e3 because e4 ˄ (ē4) = e4˄(-e1˄e2˄e3) = e1˄e2˄e3˄e4 (3 swaps e4 <-> e1 <-> e2 <-> e3 needed)
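The swap counting above follows a simple pattern: moving ei to the front past the i-1 indices smaller than i costs i-1 swaps. A small sketch (my own) that reproduces the four signs:

```python
def basis_vector_complement(i, n=4):
    """Return (sign, rest) such that e_i ^ (sign * e_rest) = e_1...e_n.

    Moving e_i leftward past the i-1 indices smaller than i costs
    i-1 swaps, each contributing a factor of -1.
    """
    rest = tuple(j for j in range(1, n + 1) if j != i)
    return (-1) ** (i - 1), rest

for i in range(1, 5):
    print(i, basis_vector_complement(i))
# signs come out +, -, +, -, matching the list above
```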

Note that ~1 = I and ~I = 1, because 1 ˄ I = I and I ˄ 1 = I. So the complement of the unit scalar is the unit anti-scalar and vice versa. They are ‘dual’ under complement, just like in Boolean algebra. Grassmann didn't distinguish between pseudo-scalars and scalars; he didn't even make a notational difference between the exterior and regressive products.

We define the regressive product ˅ to form new "anti-sub-spaces" using {ē1,ē2,ē3,ē4} as an "anti-basis":

  • the 1D anti-sub-space of anti-scalars: basis {I} = {~1}
  • the 4D anti-sub-space of anti-vectors: basis {ē1,ē2,ē3,ē4}
  • the 6D anti-sub-space anti-bivectors: basis {ē1˅ē2, ē1˅ē3, ē1˅ē4, ē2˅ē3, ē2˅ē4, ē3˅ē4} = {ē12, ē13, ē14, ē23, ē24, ē34}.
  • the 4D anti-sub-space of anti-tri-vectors: basis {ē1˅ē2˅ē3, ē1˅ē2˅ē4, ē1˅ē3˅ē4, ē2˅ē3˅ē4} = {ē123, ē124, ē134, ē234}
  • the 1D anti-sub-space of scalars: basis {1}

This is “dual” in the sense that we just replaced ˄ by ˅, e by ē, and 1 by I.

The dual of a blade with grade M (e.g. e123 with grade 3) maps to an anti-blade (ē123 = e4) with “anti-grade” M, i.e. grade N-M (4-3 = 1).

The regressive product can be written using the exterior product and the complement operator ~: for any blades x and y, we have something that looks like De Morgan's law:

~(x˄y) = (~x) ˅ (~y), at least up to a sign, I think

The regressive product corresponds to the ‘meet’ (“intersection”) from geometric algebra, I guess (I'm tackling Grassmann first before starting with GA). For example:

 (e123) ˅ (e234) 
 = 
 ~~((e123) ˅ (e234)) 
 = 
 ~(~e123 ˄ ~e234) 
 = 
 ~(e4 ˄ -e1)
 =
 -~e41
 =
 e23

which is indeed what e123 and e234 have in common.
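The whole calculation above can be checked mechanically. Below is a minimal sketch (my own implementation, no library assumed): a blade is a pair (coefficient, sorted index tuple), ˄ counts swaps, ~ is fixed by requiring x ˄ ~x = I for unit blades, and ˅ uses the De Morgan-style formula (up to the sign of the double complement).

```python
N = 4  # dimension of the underlying vector space

def sort_sign(idx):
    """Sign of the permutation that sorts idx, counted by inversions."""
    inversions = sum(1 for i in range(len(idx))
                       for j in range(i + 1, len(idx)) if idx[i] > idx[j])
    return -1 if inversions % 2 else 1

def wedge(x, y):
    """Exterior product of two blades (coef, indices)."""
    (sx, a), (sy, b) = x, y
    if set(a) & set(b):
        return (0, ())                     # repeated factor -> zero
    idx = a + b
    return (sx * sy * sort_sign(idx), tuple(sorted(idx)))

def comp(x):
    """Complement ~x, defined so that (unit blade) ^ ~(unit blade) = I."""
    s, a = x
    rest = tuple(j for j in range(1, N + 1) if j not in a)
    return (s * sort_sign(a + rest), rest)

def regressive(x, y):
    """Regressive product x v y = ~(~x ^ ~y), up to the sign of ~~."""
    return comp(wedge(comp(x), comp(y)))

e123 = (1, (1, 2, 3))
e234 = (1, (2, 3, 4))
print(regressive(e123, e234))   # (1, (2, 3)), i.e. e23: the common part
```

Running this reproduces the derivation step by step: ~e123 = e4, ~e234 = -e1, their wedge is e14, and its complement is e23.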

The interior product can also be defined via the regressive product and the complement:

x * y = x ˅ ~y