How do you prove the existence of the addition function without pre-supposing it?


Context: self-study from Smullyan and Fitting's Set Theory and the Continuum Problem (revised ed., 2010).

So I have this question, which is exercise 8.4 (a) in Chapter 3 (page 44 of Dover edition).

Prove that there is a unique function $A(x,y)$ from $\omega \times \omega$ into $\omega$ such that for any natural numbers $x$ and $y$, $A(x,0) = x$ and $A(x, y^+) = A(x, y)^+$. (This is the usual addition function. Writing $x + y$ for $A(x, y)$, we have $x + 0 = x$ and $x + y^+ = (x + y)^+$.)

(In the above:

  • $\omega$ is the von Neumann construction of natural numbers: $0 \equiv \varnothing$, $n \equiv \{x: x \subsetneqq n\}$
  • $n^+$ is the successor of $n$: $n^+ = n \cup \{n\}$. Once the addition function exists, one can identify $n^+$ with $n + 1$; obviously we can't talk about $n + 1$ at this stage because addition hasn't been defined yet. )

From the structure of Smullyan and Fitting, I am led to believe that I ought to be using the following "strengthened" version of the Principle of Recursive Definition (PRD henceforth), which goes:

For any class $A$ and any element $c$ in $A$ and any function $g(x, y)$ from $\omega \times A$ into $A$, there is a unique function $f$ from $\omega$ into $A$ such that $f(0) = c$ and for any number $n$, $f(n^+) = g(n, f(n))$.
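To get a concrete feel for the PRD, here is a small Python sketch (illustrative only, not part of the book's formal development): given a starting value $c$ and a step function $g$, it computes the $f$ with $f(0) = c$ and $f(n^+) = g(n, f(n))$ whose existence and uniqueness the PRD asserts.

```python
def recursively_defined(c, g):
    """Return the f determined by f(0) = c and f(n+1) = g(n, f(n)).

    Illustrative sketch: the PRD guarantees such an f exists and is
    unique; here we simply compute its values by iterating g.
    """
    def f(n):
        value = c
        for k in range(n):
            value = g(k, value)
        return value
    return f

# Example: f(0) = 1 and f(n+1) = (n+1) * f(n) yields the factorial function.
factorial = recursively_defined(1, lambda n, y: (n + 1) * y)
```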

Setting aside the confusing reuse of the letter $A$ (as the general class in the PRD and as the name of the function whose existence we want to prove), it is difficult to see which of $f$ and $g$ is analogous to the addition function $A$.

I'm trying all sorts of random stuff.

First I try this:

For any given $x \in \omega$, $f(y)$ can be defined as $A(x, y)$ such that:

$f(0) = A(x, 0) = x + 0 = x$ and $f(y^+) = A(x, f(y)) = A(x, A(x, y))$

but that is obviously completely wrong because it doesn't give the right function, and I'm presupposing the existence of the $A$ I'm trying to establish the existence of. Clearly I'm on the wrong track.

So, suppose we let this $g$ function be defined as: $g(x, y) = \begin{cases} x & : y = 0 \\ g(x, n)^+ & : y = n^+ \end{cases}$

That is, define $g$ to be the function $A$ that we are trying to establish the existence of. In this context, $f$ is the successor mapping itself.

But again, (a) that presupposes the existence of $A$, and (b) it is not clear what $g$ actually is, because it uses $g$ to define itself.

According to the rubric of the PRD above, "for ... any function $g(x, y)$ from $\omega \times A$ into $A$". In this case of course (the class) $A$ is identified with $\omega$, so it's $g(x, y)$ from $\omega \times \omega$ into $\omega$ that we want to start with.

It is of course straightforward but tedious to set up an inductive argument based on the PRD, but that is effectively "proving PRD all over again" and such seems futile.


Prove that there is a unique function $A(x,y)$…such that for any natural numbers $x$ and $y$, $A(x,0)=x$ and $A(x,y^+)=A(x,y)^+$.

Fix $x_0\in\omega$. We want to apply the PRD to the as-yet-undefined function $y\mapsto A(x_0,y)$, so recall the PRD:

…there is a unique function $f$…such that $f(0)=c$ and for any number $n$, $f(n^+)=g(n,f(n))$.

Comparing the left-hand sides, we see that we want $f(y)=A(x_0,y)$. The PRD then should produce $$f(0)=A(x_0,0)=x_0$$ and $$g(n,A(x_0,n))=g(n,f(n))=f(n^+)=A(x_0,n^+)$$ Comparing to our original claim, we see that we want to choose $c=x_0$ and $g(n,z)=z^+$.

Then the PRD implies that there is a unique function $y\mapsto A(x_0,y)$ satisfying the desired conditions at $x_0$.
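The choices $c = x_0$ and $g(n, z) = z^+$ can be sketched in Python (an illustration under my own names, `recursively_defined` and `add_to`, which are not the book's notation):

```python
def recursively_defined(c, g):
    """The unique f with f(0) = c and f(n+1) = g(n, f(n)) (PRD sketch)."""
    def f(n):
        value = c
        for k in range(n):
            value = g(k, value)
        return value
    return f

def add_to(x0):
    """y -> A(x0, y): take c = x0 and g(n, z) = z^+ (the n-argument is unused)."""
    return recursively_defined(x0, lambda n, z: z + 1)
```

Note that $g$ here ignores its first argument entirely; the full strength of the PRD (where $g$ may depend on $n$) is not needed for addition.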

Since the proof did not depend on the particular value of $x_0$, we can generalize to arbitrary values: for each $x$, there is a unique function $y\mapsto A(x,y)$ satisfying the desired conditions at $x$. But if, for every $p$ in a class $X$, there is a unique object $c_p\in\mathcal{C}$, then there exists a unique function $X\to\mathcal{C}$, viz. $p\mapsto c_p$.

Thus there exists a unique function $x\mapsto(y\mapsto A(x,y))\in\omega\to(\omega\to\omega)$; undoing the currying, $A\in\omega^2\to\omega$. But there is still a technical detail left in the quantifiers of our desired conditions.
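The uncurrying step can likewise be sketched in Python (the names `add_to` and `uncurry` are my own illustration): the family of one-argument functions $y\mapsto A(x,y)$ is packaged into a single two-argument function.

```python
def add_to(x0):
    """The unique f with f(0) = x0 and f(y^+) = f(y)^+, i.e. y -> A(x0, y)."""
    def f(y):
        value = x0
        for _ in range(y):
            value = value + 1  # apply the successor
        return value
    return f

def uncurry(h):
    """Turn x -> (y -> h(x)(y)) into a single two-argument function."""
    return lambda x, y: h(x)(y)

A = uncurry(add_to)  # A(x, y) = x + y
```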

Notice that our desired conditions do not place any "compatibility" relations on $A(x,y)$ as we vary $x$; in the first condition $A(x,0)$ appears once, and in the second, $A(x,y^+)$ and $A(x,y)$ have the same "$x$-argument". So if

for every $x$, $A$ satisfies the desired conditions at $x$,

holds, then

$A$ satisfies the desired conditions at every $x$

because those conditions are the same thing across the two statements.