Is there a general answer to the question of why certain notational systems are better than others?


The question is unfortunately a little vague. It's a pretty mundane fact that certain notational systems are better than others for certain tasks. My informal sense of why this is the case is just that certain calculations which have some kind of interest to us can be solved by some algorithms which are particularly fast/easy to implement in one notational system, but slow and inefficient in the other.

Is there a way to state this more precisely? In particular, I'm wondering what the accepted tools are for asking how well-suited a notational system is for representing its intended domain, and whether there's some quantitative measure we can use to compare competing systems (presumably relative to some specified task). Even more generally, can we say anything insightful about the mathematical relationship between a notational system and the structures it talks about?

2 Answers

Best Answer

I think that complexity theory has some useful concepts that help us formalize the problem.

The first thing that comes to mind is that any problem taking natural numbers as inputs appears to get an exponential speedup when we restrict the inputs to unary, simply because the unary input is exponentially longer and complexity is measured relative to input length.

For example, while factoring numbers given in binary is suspected to be infeasible in polynomial time, factoring numbers given in unary is trivially in $P$.
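A minimal sketch of why (the function name and example values are my own): trial division takes $O(\sqrt{n})$ steps, which is polynomial in the length $n$ of a unary input but exponential in the $\approx \log_2 n$ length of a binary input.

```python
import math

def trial_division(n: int) -> int:
    """Return the smallest nontrivial factor of n (or n itself if prime).

    Runs in O(sqrt(n)) division steps: polynomial in the unary input
    length (which is n), but exponential in the binary length (~log2 n).
    """
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n

n = 91
unary_length = n                # '1' * n is n symbols long
binary_length = n.bit_length()  # only 7 symbols
# trial_division(91) returns 7: about sqrt(91) ~ 9 steps,
# tiny compared to the unary input length of 91.
```

So the same algorithm is "polynomial" or "exponential" depending only on how we chose to write down the input.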

Of course, there is a big problem here, which is that we are artificially inflating the input size to 'fool' the complexity metric we are using. In real life, you cannot factorize big numbers in poly-time because you first would have to expand them to unary form, which takes exponential time.

This looks discouraging and counterintuitive. After all, if we are hoping to get a proper handle on how encoding affects complexity, we would expect a metric that rewards clever encodings and punishes stupid ones. But here the less insightful notation (unary) gets boosted in comparison with the clever logarithmic one.

But we can go a bit deeper down the rabbit hole of complexity theory to see what else is there.

In computability theory we have the well-known Church–Turing Thesis, which states that every realistic model of computation can be simulated by a Turing machine.

Complexity theorists came up with a strengthening of this thesis, the Strong Church–Turing Thesis, which states that every realistic model of computation can be simulated by a Turing machine with at most polynomial overhead.

In other words, it should not matter how you encode or manipulate information; as long as your model is physically realistic, it will be efficiently reducible to the encoding used by a simplistic TM.

...except that we currently believe that the Strong CT thesis is false.

Quantum computing has turned out to be a strange computational model that we have not been able to reduce to the classical case efficiently, and many researchers believe no such efficient reduction exists.

If so, encoding a problem in qubits can be genuinely more efficient than any classical encoding, which suggests that computational complexity is not as bad a metric for comparing representations as the unary example made it look.

tl;dr - computational complexity is one such tool, though the problem is far from solved.

Second Answer

Since the standard long-division algorithm (https://en.wikipedia.org/wiki/Long_division) runs in time polynomial in the number of digits, the choice of base cannot play a crucial role (unary notation excepted): if you had an algorithm that works well in some specific base $k$, you could easily convert the input into that notation:

Until n == 0:

1) compute r = n mod k (using standard long division)

2) divide n - r by k to obtain the new n

The sequence of remainders r, read in reverse order, is the desired number written in base $k$.

The complexity of the conversion is polynomial, and in fact of low degree; it is nothing like $O(n^{1000000})$.
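The loop above can be sketched as follows (the function name is my own choice):

```python
def to_base(n: int, k: int) -> list[int]:
    """Convert n >= 0 to base k >= 2 by repeated division.

    Each iteration performs one long division, polynomial in the
    number of digits, and there are O(log_k n) iterations, so the
    whole conversion runs in polynomial time.
    """
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % k          # step 1: remainder via long division
        digits.append(r)
        n = (n - r) // k   # step 2: new n
    # remainders come out least-significant first, so reverse them
    return digits[::-1]

# to_base(2024, 16) returns [7, 14, 8], i.e. hexadecimal 7E8
```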

The unary notation is obviously worse: it uses exponentially more space than any positional notation to store the same number.
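The exponential gap is easy to see concretely (the example value is arbitrary):

```python
# Length of n written in unary vs. in a positional base k >= 2:
# unary needs n symbols, base k needs floor(log_k n) + 1.
n = 10**6
unary_length = n                 # 1,000,000 symbols
binary_length = n.bit_length()   # 20 symbols
decimal_length = len(str(n))     # 7 symbols
# unary is n symbols vs. O(log n): exponentially longer.
```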