Axiomatization of reducibility notions


In computability theory, there are many notions of reducibility (such as Turing reducibility, enumeration reducibility, many-one reducibility, etc.). I am curious—has there been any attempt to axiomatize a notion of "abstract reducibility"; i.e., has anyone laid out a set of axioms that make an arbitrary preorder $\leq$ on $\mathcal{P}(\mathbb{N})$ a notion of "reduction"?

For example, if $\leq$ is to be a reducibility notion, we might want the following as axioms:

  • $\forall X \ [\emptyset, \mathbb{N} \leq X]$
  • $\forall X \ [X \leq X']$, where $X'$ denotes the Turing jump of $X$
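To make the first proposed axiom concrete, here is a small illustrative sketch (my own example, not part of the question) of what a specific reducibility looks like as a relation between sets: for many-one reducibility, $A \leq_m B$ holds iff there is a computable $f$ with $n \in A \Leftrightarrow f(n) \in B$. Sets are represented by their characteristic functions on an initial segment of $\mathbb{N}$, and the names `EVENS`, `MULT4`, and `f` are hypothetical.

```python
# Sketch: many-one reducibility  A <=_m B  iff there is a computable f
# with  n in A  <=>  f(n) in B.  Sets are given by characteristic functions.

EVENS = lambda n: n % 2 == 0   # A = even numbers
MULT4 = lambda n: n % 4 == 0   # B = multiples of 4

# Witnessing reduction: n is even  iff  2*n is a multiple of 4.
f = lambda n: 2 * n

# Spot-check the defining equivalence on an initial segment of N.
assert all(EVENS(n) == MULT4(f(n)) for n in range(1000))
```

Any such relation is automatically a preorder (the identity witnesses reflexivity, and composing witnesses gives transitivity), which is why the question asks which *further* axioms single out the "reducibility-like" preorders.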

If no one has written down such a set of axioms, what challenges make the informal notion of reducibility hard to axiomatize as a kind of preorder on $\mathcal{P}(\mathbb{N})$?


There are 2 best solutions below


The closest things I've been able to find so far are "jump upper semilattices," although they're not exactly what I was looking for, for a couple of reasons.

A jump upper semilattice is a partial order equipped with a binary operation $\cup$ and a unary operation $\mathfrak{j}$ such that $\cup$ behaves like the join does on the Turing degrees and $\mathfrak{j}$ behaves like the Turing jump does on the Turing degrees. (See the link above for a precise definition.)
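The core axioms can be spot-checked on a toy example. Here is a sketch (my own illustration, not from the answer) using $(\mathbb{N}, \leq)$ with join $=\max$ and jump $= n+1$, checking that the join is a least upper bound and that the jump is strictly increasing and monotone on a finite sample:

```python
# Toy spot-check of jump upper semilattice axioms on (N, <=):
#   join(x, y) is the least upper bound of x and y,
#   jump is strictly increasing: x < jump(x),
#   jump is monotone: x <= y  implies  jump(x) <= jump(y).

join = max
jump = lambda n: n + 1

sample = range(20)
for x in sample:
    assert x < jump(x)                        # strictness
    for y in sample:
        z = join(x, y)
        assert x <= z and y <= z              # z is an upper bound
        assert all(z <= u for u in sample if x <= u and y <= u)  # and the least one
        if x <= y:
            assert jump(x) <= jump(y)         # monotonicity
```

On the Turing degrees the same axioms are satisfied with $\cup$ the usual join and $\mathfrak{j}$ the Turing jump.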

In one sense, this notion is more general than I was looking for, since a jump upper semilattice does not necessarily have to be a structure on $\mathcal{P}(\mathbb{N})$, and since the ordering in a jump upper semilattice does not necessarily have to reflect any kind of computation.

Additionally, not every notion of reducibility forms a jump upper semilattice with the $\cup$ being the usual join and $\mathfrak{j}$ being the Turing jump. For example, $\mathfrak{j}$ must be strictly increasing in a jump upper semilattice, and the Turing jump is not strictly increasing on the enumeration degrees. However, there is a different operator, the enumeration jump, that is strictly increasing on the enumeration degrees. So, the enumeration degrees can form a jump upper semilattice, just not with the Turing jump as the jump operator.

Now, I am curious about a related question—does every notion of reduction form a jump upper semilattice with some "natural jump operation" as its jump and some "natural join operation" as its join, and can we axiomatize what "natural jump and join operations" should be?


Not really an answer, more like a long comment: I think it is hard to come up with a satisfactory answer. First of all, you are talking about reducibility notions on $\mathcal{P}(\mathbb{N})$, but in computability theory, people have also studied notions of reducibility on higher-order objects, like subsets of $\mathcal{P}(\mathbb{N})$.

Moreover, all the reducibility notions studied capture some sort of "computation". If you strip away the actual way a set $A$ is computed from a set $B$, then you're left with just a quasi-order on sets (and, in turn, a partial order on equivalence classes), which may not be super interesting. People in category theory have studied partial combinatory algebras. I leave it to you to decide whether this is a satisfactory generalization of the notion of computation.

For the jump, while it is true that the Turing jump plays a fundamental role in the study of the Turing degrees, there are other structures for which I'm not aware of any "natural" notion of jump. The mere existence of an abstract jump can be easily proved in any upper semilattice using the axiom of choice. So then the problem would be "what would you consider natural or satisfactory as a notion of jump"?

I would say (and again these are just some personal thoughts, not really a proper answer) that an abstract notion of computable reducibility features:

  • a model of computation (what does it mean to compute your set/function?)
  • an "oracle call", that is, if you want $f\le g$, you need to tell me how $g$ can help the computation of $f$.

Typically it goes like this: you start from an input for $f$, produce an input for $g$, apply $g$, and use the answer to continue your computation of $f$. And then there are all possible kinds of variations: maybe producing an input for $g$ given an input for $f$ need not be computable. Or maybe it is computable but not uniformly so (different inputs may require different functionals). Or maybe you have constraints on the number of times you can call the oracle $g$. Or maybe when you apply $g$ you lose access to the original input for $f$. And so on. Basically, most if not all of these variations have a name in the literature.
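The single-oracle-call scheme just described can be sketched as code (my own illustration; the names `pre`, `post`, and `reduce_via` are hypothetical): to compute $f$ with the help of $g$, translate the input, consult $g$ once, and post-process. Restricting `post` to ignore the original input recovers a many-one-style reduction; letting it keep the input gives a more liberal notion.

```python
# Sketch of the oracle-call pattern:  f(x) = post(x, g(pre(x))).

def reduce_via(pre, post, g):
    """Build a function computing f(x) = post(x, g(pre(x))),
    i.e. f reduced to g by one oracle call."""
    return lambda x: post(x, g(pre(x)))

# Toy instance: compute "is n odd?" from the oracle "is n even?".
is_even = lambda n: n % 2 == 0
is_odd = reduce_via(pre=lambda n: n,          # input translation
                    post=lambda n, ans: not ans,  # post-process the oracle answer
                    g=is_even)

assert is_odd(3) and not is_odd(4)
```

The variations in the paragraph above correspond to relaxing or constraining the pieces of this scheme: whether `pre` must be computable, whether one fixed `(pre, post)` pair works uniformly for all inputs, how many oracle calls are allowed, and whether `post` may see the original input `x`.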