Formal dimensional analysis?

Dimensional analysis is a method that I know from physics, where quantities are "annotated" with a "dimension". E.g. rather than writing $$4\cdot 5 = 20$$ we write $$4\,\mathrm{m} \cdot 5\,\mathrm{s} = 20\,\mathrm{m}\cdot\mathrm{s}$$

The benefit of this is that it allows you to sanity-check that you are putting the right quantities in the right places. If you put a quantity of one dimension in a place where another dimension is expected, you get a "dimension error" (my term).

In this sense dimensional analysis is similar to type theory, where objects are "annotated" with "types", which allows you to sanity-check that you are putting the right objects in the right places: if you put an object of one type where another type is expected, you get a "type error".

Question:

  • Is there a formal relation between type theory and dimensional analysis? Can we model dimensional analysis (as it appears in physics) as a type theory?

  • If not, is there some other formalization in mathematics of dimensional analysis, where we are allowed to multiply and divide quantities and where their dimensions are then also multiplied and divided?
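One standard way to formalize the second bullet is to treat a dimension as a vector of integer exponents over a set of base dimensions, so that multiplying quantities adds the exponent vectors and dividing subtracts them, while addition demands equal vectors. Here is a minimal sketch of that idea in Python (the `Quantity` class and its names are my own illustration, not a standard library):

```python
class Quantity:
    """A number tagged with a dimension: a map from base-dimension
    names (e.g. 'm', 's') to integer exponents."""

    def __init__(self, value, dim=None):
        self.value = value
        # drop zero exponents so dimensionless quantities compare equal
        self.dim = {k: e for k, e in (dim or {}).items() if e != 0}

    def __mul__(self, other):
        # multiplying quantities adds dimension exponents
        dim = dict(self.dim)
        for k, e in other.dim.items():
            dim[k] = dim.get(k, 0) + e
        return Quantity(self.value * other.value, dim)

    def __truediv__(self, other):
        # dividing quantities subtracts dimension exponents
        dim = dict(self.dim)
        for k, e in other.dim.items():
            dim[k] = dim.get(k, 0) - e
        return Quantity(self.value / other.value, dim)

    def __add__(self, other):
        # addition is only defined between equal dimensions
        if self.dim != other.dim:
            raise TypeError(f"dimension error: {self.dim} + {other.dim}")
        return Quantity(self.value + other.value, self.dim)

    def __repr__(self):
        unit = "·".join(f"{k}^{e}" if e != 1 else k
                        for k, e in sorted(self.dim.items()))
        return f"{self.value} {unit}".strip()

d = Quantity(4, {"m": 1}) * Quantity(5, {"s": 1})
print(d)   # 20 m·s
v = Quantity(20, {"m": 1}) / Quantity(4, {"s": 1})
print(v)   # 5.0 m·s^-1
```

Under this reading the dimensions form a free abelian group on the base dimensions, which is (as far as I know) the structure behind statically checked units of measure in languages such as F#, where the "dimension error" above becomes a compile-time type error.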

Answer:

Principia Mathematica -- the origin of the term "Type Theory" -- was intended to address this in its final volume. Unfortunately, according to Russell, Whitehead abandoned the labor for philosophy, and then Russell did the same. You can get a flavor of where they were heading in the unfinished final volume: see "Relation Arithmetic"*, "Measurement", and "Quantity".

Relation arithmetic's "relation numbers" would have provided the formal basis for a theory of the abstract structure of quantitative measurement. This was the point of departure for my efforts to reorient Hewlett Packard's "Internet Chapter 2" toward foundations in computer science, because the problem is that deep. There really is no excuse for this state of affairs.

See my essay "Rescuing Computer Science With the Relational Dimensions of the Empirical World" for an informal glimpse into that effort. Unfortunately, the mathematician I hired for that work, Tom Etter (one of the attendees at the 1956 Dartmouth Summer AI Workshop, along with Ray Solomonoff), passed away shortly after the DotCom bubble burst; the project was shut down, and no one with money was interested in supporting this direction of work.

* The only scholar to even attempt to explain why no one has followed up on relation arithmetic asserts that model theory subsumes it. This is obviously false. The real problem is that relation numbers, as defined by Russell, had a technical flaw, which Etter described in "Relation Arithmetic Revived":

> Despite its great promise, relation-arithmetic didn’t get very far. The problem is that the most important combining operators for relations, such as cross product and join, are not invariant under similarity. That is, if A is similar to A’ and B is similar to B’, it does not follow that the cross product or join of A and B is similar to the cross product or join of A’ and B’. This can be seen from a very simple example. Consider a unary relation R with only one tuple (its table has one row and one column). Let the value in that row and column be v. Replacing v by any other value w produces a relation R’ that is similar to R. Now consider the cross product (Cartesian join) RR; it consists of the ordered pair vv. But the cross product RR’ consists of the ordered pair vw, so RR and RR’ are not similar, despite the similarity of their components.
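The counterexample in this passage can be checked mechanically. Below is a small sketch (the helper names are my own), taking a relation to be a set of tuples and similarity to mean the existence of a bijection between the relations' fields that carries one relation exactly onto the other:

```python
from itertools import permutations

def field(rel):
    """All values occurring anywhere in the relation's tuples."""
    return sorted({x for t in rel for x in t})

def similar(r1, r2):
    """Similarity: some bijection between fields maps r1 onto r2."""
    f1, f2 = field(r1), field(r2)
    if len(f1) != len(f2):
        return False
    for perm in permutations(f2):
        f = dict(zip(f1, perm))
        if {tuple(f[x] for x in t) for t in r1} == set(r2):
            return True
    return False

def cross(r1, r2):
    """Cross product: concatenate each tuple of r1 with each of r2."""
    return {a + b for a in r1 for b in r2}

R  = {("v",)}   # unary relation whose one tuple holds the value v
Rp = {("w",)}   # replace v by w: similar to R
assert similar(R, Rp)

RR  = cross(R, R)    # {("v", "v")}
RRp = cross(R, Rp)   # {("v", "w")}
assert not similar(RR, RRp)   # cross product breaks similarity
```

The last assertion succeeds because RR's field is the single value v while RR'’s field is the pair v, w, so no bijection between the fields exists at all, exactly as the quoted example argues.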

Etter's solution is to provide a context relation within which congruence classes may obtain, in addition to similarity classes.