Abstract Algebra and Parallel Computing


I've recently been learning a bit about parallel computation. Two things I recently learned about are Reduce and Scan.

Reduce is defined as a 2-tuple of a set of elements and an associative binary operation.

Scan is defined as a 2-tuple of a set of elements and an associative binary operation, where the pair also has an identity element.

So a Scan is essentially a group. Now, I haven't looked much further into this, but does anyone know whether this idea (that parallel computation works on sets of information with associative binary operations) has had principles of abstract algebra applied to it?
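For concreteness, here is a minimal sketch in Python of the two primitives, using integer addition as the associative operation (the tree-shaped reduce and the variable names are my own illustration, not part of any particular parallel framework):

```python
from functools import reduce
from itertools import accumulate
from operator import add

data = [1, 2, 3, 4, 5, 6, 7, 8]

# Reduce: combine all elements with one associative binary operation.
total = reduce(add, data)            # 36

# Scan: the sequence of all prefix combinations; here `add`, whose
# identity is 0.
prefixes = list(accumulate(data))    # [1, 3, 6, 10, 15, 21, 28, 36]

# Associativity is what permits parallelism: the input can be split
# into chunks, each chunk reduced independently (possibly on another
# processor), and the partial results combined in any grouping.
def tree_reduce(op, xs):
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    # The two recursive calls could run in parallel.
    return op(tree_reduce(op, xs[:mid]), tree_reduce(op, xs[mid:]))

assert tree_reduce(add, data) == total
```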

3 Answers

Best answer:

Reduce qualifies as a semigroup, and Scan is a monoid (a monoid is a semigroup with an identity).

Neither is a group; a group is, essentially, a monoid in which every element has an inverse.

Take a peek at the linked entries in Wikipedia to get a feel for the properties shared by semigroups, and those shared by monoids.
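One way to see the distinction concretely, sketched in Python (the examples are my own, chosen only to illustrate the three structures):

```python
from functools import reduce

# String concatenation: associative, with identity "" -> a monoid.
# It is not a group: no nonempty string has a concatenation inverse.
words = ["par", "allel"]
assert reduce(lambda a, b: a + b, words, "") == "parallel"

# max on integers: associative -> a semigroup. With no designated
# identity element, reducing an empty sequence is undefined, which is
# why reduce raises a TypeError on [] without an initializer.
assert reduce(max, [3, 1, 4, 1, 5]) == 5

# Integer addition: associative, identity 0, and every n has an
# inverse -n -> a group.
assert 7 + (-7) == 0
```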

Answer:

I think you may be interested in looking at approaches to formal modelling of concurrent systems known as process calculi or process algebras. See, for example, "A brief history of process algebra" (J.C.M. Baeten, Theoretical Computer Science, Volume 335, Issues 2–3, 23 May 2005, Pages 131–146).

Answer:

Perhaps not surprisingly, this topic comes up every so often in the Haskell world. I'm not keeping up, but I recently came across a blog entry where an optimization to a machine learning algorithm is described using a monoid, a group, and a vector space.

You might also find this stackoverflow answer interesting.

I'm racking my brain to see if I can remember the right keywords to find a paper presented to a CS conference a couple of years ago with this subject; hopefully I'll edit it in later.