Why are particular combinations of algebraic properties "richer" than others?


Pedagogically, when students are exposed to algebraic structures it seems standard for the major emphasis, if not all the emphasis, to be on groups, rings, R-modules, and categories. These are rich structures with interesting properties, but in the big picture I have wondered why some defining properties make for a rich structure, while other properties give less interesting structures, or nothing worth teaching at all.

As a motivating example, a set (or class, whatever) that is closed under some operation seems necessary to talk about anything meaningful; however, why is the particular combination of

  1. Having inverse elements
  2. Having an identity element
  3. Associativity

richer (a group) than the combination obtained by simply replacing associativity with commutativity (a structure I don't even know a name for)? I have also wondered why associativity is much more prevalent than commutativity. As another motivating example: we teach much about groups and rings, but why not loops, monoids, semilattices, and near-rings? What makes the former set either richer in structure or more pedagogically sound to teach?

Even in category theory I can ask what makes the specific combination of defining properties of a category so great. Why associativity and not commutativity? Why categories and not semicategories? I wonder why its particular combination of defining properties is more "powerful", deep, and pervasive than another combination of properties.

4 Answers

Accepted answer:

Examples!

Remember that most (if not all) abstract structures are motivated by specific examples, and it took mathematicians a long time to abstract from these examples and develop an axiomatic framework. Since you asked about groups: permutation groups, symmetry groups, Lie groups (a.k.a. transformation groups), and ideal class groups appeared naturally in the 19th century, even before the general notion of a group was born (Cayley, Galois, Klein, Kronecker, Lie, and many others). There is nothing interesting about the group axioms in themselves; the interest lies in the fact that they subsume what happens in so many examples, and that phenomena observed in specific examples can be studied for arbitrary groups. The same remarks apply, even more so, to the notion of a category.

Monoids also appear very naturally in many examples. They have a rich theory, quite different from the theory of groups, but in general I would say monoids are harder to understand than groups. For example, whereas finitely generated commutative groups are classified, this is not the case for finitely generated commutative monoids. For this reason one often turns a monoid into a group by formally introducing inverses; the result is called the Grothendieck group, which is especially important in K-theory.
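To make the Grothendieck construction concrete, here is a minimal sketch in Python applied to the commutative monoid $(\mathbb{N},+)$, which yields $\mathbb{Z}$ as formal differences. The class name `Diff` and the method names are illustrative choices for this sketch, not any standard API.

```python
from dataclasses import dataclass

# Grothendieck construction for the commutative monoid (N, +):
# a pair (a, b) stands for the formal difference a - b, with
# (a, b) ~ (c, d)  iff  a + d == b + c.

@dataclass(frozen=True)
class Diff:
    a: int  # "positive part", an element of N
    b: int  # "negative part", an element of N

    def normalize(self) -> "Diff":
        # Canonical representative of the equivalence class:
        # subtract the common part so that min(a, b) == 0.
        m = min(self.a, self.b)
        return Diff(self.a - m, self.b - m)

    def __add__(self, other: "Diff") -> "Diff":
        # Componentwise monoid operation, then normalize.
        return Diff(self.a + other.a, self.b + other.b).normalize()

    def inverse(self) -> "Diff":
        # The formal inverse that the monoid N itself lacks.
        return Diff(self.b, self.a).normalize()

# 3 + (-5) = -2, canonically represented as (0, 2)
x = Diff(3, 0) + Diff(5, 0).inverse()
```

Normalizing to a canonical representative lets the generated equality test stand in for the equivalence relation, so `Diff(3, 0) + Diff(3, 0).inverse()` really equals the zero element `Diff(0, 0)`.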

Monoids play an even more important role when we internalize them into arbitrary monoidal categories; this leads to the notion of a monoid object. Monoid objects in $\mathsf{Set}$ are monoids in the usual sense, but monoid objects in $\mathsf{Ab}$ are rings in the usual sense! This reveals a considerable overlap between monoid theory and ring theory. In the commutative case, we can even go further and develop algebraic geometry for commutative monoid objects (Toën-Vaquié, Florian Marty).

I haven't worked with loops or near-rings, but I am fairly sure they aren't covered in most courses because they don't have as many interesting examples as groups and rings.

The conclusion is very simple: abstract structures are motivated by specific examples. And this is not restricted to algebra. You could also go ahead and ask "why the union axiom in the definition of a topology?". The answer is the same: because examples (especially the class of metric spaces) motivated this axiom. Given a random system of operations and rules between them, you cannot really tell whether it is interesting or natural.

Answer 2:

Well, the basic binary operations that we tend to understand most deeply are addition and multiplication of numbers or matrices, composition of functions, concatenation of finite sequences (strings), and intersection and union of sets. All of these are associative.

Associativity is very natural: it allows one to apply the operation to more than two terms; that is, an associative binary operation generates a unique $n$-ary operation for every $n$.

Not every operation that occurs is associative, of course, but theories of non-associative operations must use a lot of brackets (or a convention for how to place the brackets when they are omitted). For example, exponentiation is not associative: $$(x^y)^z\,\ne\,x^{(y^z)}\,.$$ Other examples are the Cayley numbers (octonions) built from the quaternions, and Lie algebras.
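The contrast above can be checked numerically; this small Python snippet shows that bracketing is irrelevant for the associative operations but matters for exponentiation:

```python
# Associativity makes bracketing irrelevant; exponentiation
# illustrates what goes wrong without it.

x, y, z = 2, 3, 2

# Addition and multiplication: both bracketings agree.
assert (x + y) + z == x + (y + z)
assert (x * y) * z == x * (y * z)

# Exponentiation: (x^y)^z != x^(y^z) in general.
left = (x ** y) ** z    # (2^3)^2 = 64
right = x ** (y ** z)   # 2^(3^2) = 512
assert left != right
```

This is also why the convention that `x ** y ** z` associates to the right has to be fixed by the language: without associativity, the expression is ambiguous on its own.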

Commutativity combined with associativity also permits you to permute the terms: $$a_1a_2\dots a_n\ =\ a_{\sigma(1)}a_{\sigma(2)}\dots a_{\sigma(n)}$$ where $\sigma$ is a permutation of indices $\{1,2,\dots,n\}$.

Without associativity, commutativity is much less useful; at any rate, commutative non-associative operations are hardly studied at all, probably for lack of real motivation.


Let me instead show one more property of operations that is important in category theory:

Let a binary operation $*$ satisfy the following interchange property: $$(a*b)*(c*d)\ =\ (a*c)*(b*d)\,.$$ Then the category $\mathcal V$ of such structures is self-enriched, in the sense that the pointwise $*$ operation maps a pair of homomorphisms to a homomorphism (just as in the case of Abelian groups).

(Note also that if $*$ has a unit element, then $*$ is already both commutative and associative; this is the Eckmann–Hilton argument.)
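Assuming a unit element $e$ for $*$, the remark above can be spelled out in two lines, each an instance of the interchange law:

```latex
% Commutativity: specialize the interchange law with a = d = e
b * c \;=\; (e * b) * (c * e) \;=\; (e * c) * (b * e) \;=\; c * b\,.

% Associativity: insert the unit and apply interchange to (a, e, b, c)
a * (b * c) \;=\; (a * e) * (b * c) \;=\; (a * b) * (e * c) \;=\; (a * b) * c\,.
```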

Answer 3:

This is a very soft question in my opinion, hence I added the soft-question tag.

Here are some (soft) considerations.

The simplest algebraic structures with a binary operation are called magmas, and although they may not be as popular as groups or rings, they have their fans and a relatively large theory, considering how simple their axioms are. Some algebraic structures were studied earlier, and are now more popular, simply because they were used to solve problems in other areas of mathematics, geometry, or physics. Think of groups. Later came rings and vector spaces; modules came only recently, followed by categories, and now even more general structures exist.

Categorically, all algebras are algebras for a monad, so depending on the monad you may get different properties. Associativity tends to be a desirable property, probably just because people like to iterate their operations, and getting rid of parentheses is a bonus. The interesting thing is that in physics many operations are indeed associative, which reinforces the purely mathematical notion.

Ultimately, students and researchers tend to gravitate toward tools (algebraic structures, in this case) that seem to be the most effective in solving real-world problems, and this includes problems in other areas of mathematics.

Answer 4:

Since you already got 3 deep and pervasive answers, I want to concentrate on a single minor point.

"I can ask what makes the specific combination of defining properties of a category so great?"

In my humble opinion, almost nothing. You may be interested in work that a friend of mine and I (to a minor extent) have been laying out, in which we study multi-object partial magmas and recover a great deal of classical "category" theory: you get these things called plots, where composition is not defined for every pair of consecutive arrows, and where it is defined it is possibly non-associative. Finally, you don't have identities everywhere.

"You fool, nothing good can come out of these poorly behaved, thorny things!"

:) not at all.

You can define "isomorphisms" (yes, without identities), notice that "being an isomorphism" and "admitting an inverse" are different notions in this world, and that they collapse in the categorical world (a category is an associative plot where composition is everywhere defined and every object has a 1, in the same vein that a monoid is an extremely smooth partial magma). You can then define isoids, i.e. plots in which every arrow is an isomorphism.

We're even able to define morphisms of plots (p-unctors), natural transformations (trimmings, if I remember the name Salvatore and I chose correctly), adjoints, limits, and a chain of free-forgetful adjunctions that connects the category (it is a category) of plots to the categories of associative plots, semicategories (in this case we even have two different adjunctions, for two different fully faithful embeddings), and categories. Other things on the to-do list: what is an $n$-plot? How can one define the localization of a plot with respect to a family of arrows? How about simplicial matters, and how about enrichments (whatever this means)?

"You are only children playing with symbols! Examples, Examples!!"

:) Functional analysis and symplectic geometry provide "natural factories" of examples of such structures. For example, one of our two unitization functors, applied to the category of symplectic relations, gives precisely the Wehrheim-Woodward category.