Computable functions on sets other than the integers?


I have recently started reading about computability of functions and am a little surprised that every mathematical definition of computable function seems to only talk about functions $f : \mathbb{N} \rightarrow \mathbb{N}$.

Has anyone formulated a mathematically rigorous definition of computable functions on more general sets (e.g. integers, rational numbers, $\mathbb{Z} \times \{1,2,...,n\}$, etc)? I realize definitions of computable functions based on the ideas of recursion would require some technical adjustment to accommodate such sets.

Obviously, there is a bijection between any two countable sets. However, since there are infinite subsets of $\mathbb{N}$ which are not decidable/computable/recursive, we can't immediately extend our definition of computable functions to arbitrary countable sets. The infinite sets would need to be "recursively defined" somehow, which brings back the previously mentioned technicalities of defining recursive (computable) functions on or to sets other than $\mathbb{N}$.
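To illustrate the kind of transfer I have in mind: a computable bijection between $\mathbb{Z}$ and $\mathbb{N}$ lets one move a function on $\mathbb{Z}$ to a function on $\mathbb{N}$ and back. Here is an informal Python sketch (the function names are my own, just for illustration):

```python
# The standard "zigzag" bijection Z <-> N:
#   0, -1, 1, -2, 2, ...  <->  0, 1, 2, 3, 4, ...

def z_to_n(z: int) -> int:
    """Encode an integer as a natural number."""
    return 2 * z if z >= 0 else -2 * z - 1

def n_to_z(n: int) -> int:
    """Decode: the inverse of z_to_n."""
    return n // 2 if n % 2 == 0 else -(n + 1) // 2

def lift(f):
    """Turn a function f : Z -> Z into its encoded version N -> N.
    f is 'computable on Z' exactly when this lifted version is
    computable in the ordinary sense on N."""
    return lambda n: z_to_n(f(n_to_z(n)))

# Example: f(z) = z - 3 on Z, viewed through the encoding.
g = lift(lambda z: z - 3)
```

The question is whether a definition like this can be made rigorous in general, i.e. which encodings should count as admissible.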

References to good texts or articles discussing such extensions would be great!

1 Answer
Indeed. You can use any set which has an effective symbolic representation.

The problem with doing that in full generality is that you'd need a rigorous definition of "effective symbolic representation". It's not that there's any real disagreement about what that means, but the only technically workable way I can imagine defining it is to declare by fiat that some canonical set is effectively symbolic, and to say that an effectively symbolic representation of a different set is an "effective" injection into your canonical set.
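For instance, taking $\mathbb N$ as the canonical set, an effective injection of $\mathbb Q$ into $\mathbb N$ can be built from the Cantor pairing function. A sketch (my own illustrative names and conventions, not from any particular text):

```python
def pair(a: int, b: int) -> int:
    """Cantor pairing function: a computable bijection N x N -> N."""
    return (a + b) * (a + b + 1) // 2 + b

def rational_code(p: int, q: int) -> int:
    """An effective injection of Q into N: encode the reduced fraction p/q
    (with q > 0 and gcd(p, q) = 1) by pairing a zigzag code of the
    possibly-negative numerator with the denominator."""
    z = 2 * p if p >= 0 else -2 * p - 1  # zigzag-encode the numerator
    return pair(z, q)
```

With such a coding fixed, "computable function on $\mathbb Q$" just means: computable on the codes.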

But then what we're really doing is just computability theory on that canonical set anyway, just with more words surrounding it! All we can do is vary our choice of canonical set -- but all the reasonable choices are equivalent anyway.

As you've noticed, mathematical logicians like to use $\mathbb N$ as the canonical set.

If you ask a computer scientist, they will be somewhat more likely to consider the set of finite bit strings (or possibly finite strings over some unspecified finite alphabet) to be canonical. If they're at the "engineering" end of computer science they may even prefer byte strings, though this may be considered somewhat facetious.
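The equivalence of these choices is itself witnessed by computable bijections. For example, a standard bijection between $\mathbb N$ and the set of finite bit strings (a sketch in Python, my own illustration):

```python
def n_to_bits(n: int) -> str:
    """Bijectively encode a natural number as a finite bit string:
    0 -> "", 1 -> "0", 2 -> "1", 3 -> "00", 4 -> "01", ...
    (write n + 1 in binary and drop the leading '1')."""
    return bin(n + 1)[3:]

def bits_to_n(s: str) -> int:
    """Inverse: prepend the '1' and read the result back in binary."""
    return int("1" + s, 2) - 1
```

Since both directions are clearly computable, computability theory over $\mathbb N$ and over bit strings is the same theory up to this translation.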

As a randomish text suggestion, Neil Jones (who was my Ph.D. advisor) considered the set of lisp-like data items to be technically superior to either of these, and wrote a book developing the standard theory in that setting.