Generalising the recursion theorem to well-founded posets


The recursion theorem states the following.

Suppose $(A,\preccurlyeq)$ is a chain with order type $\omega$. Let $X$ be any set, let $x\in X$ and let $g\colon X\times A \to X$ be a function. Then there exists a unique function $f\colon A\to X$ such that $$f(a) = \begin{cases} \hfil x\hfil & \text{if $a=\min A$}\\ g(f(t),a) & \text{if $a=\operatorname{succ}(t)$}. \end{cases} $$

(E.g., for the factorial function, we put $A=\mathbb N=X$, $x=1$ and $g(m,n)=mn$.)
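To make the statement concrete, here is a minimal Python sketch of the theorem's scheme for $A=\mathbb N$ (the names `recurse_chain` and `g` are mine, not part of the theorem):

```python
def recurse_chain(x, g, n):
    """f(min A) = x and f(succ(t)) = g(f(t), succ(t)), on the chain (N, <=)."""
    f = x
    for a in range(1, n + 1):  # walk up the chain from min A = 0 to n
        f = g(f, a)
    return f

# Factorial: X = A = N, x = 1, g(m, n) = m * n.
def factorial(n):
    return recurse_chain(1, lambda m, a: m * a, n)

print(factorial(5))  # 120
```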

Now if we instead have a poset $(A,\preccurlyeq)$ which is well-founded, we can partition it so that the classes of the partition form a chain which has order type $\omega$. Indeed, define the idea of a "minimal set" as follows.

Let $(A,\preccurlyeq)$ be a poset. The minimal set of a subset $B\subseteq A$, denoted $M_\preccurlyeq(B)$ or just $M(B)$, is the set defined by $$M(B):= \{b\in B : \text{there is no $m\in B\smallsetminus\{b\}$ such that $m\preccurlyeq b$}\}.$$

Then the desired partition of $A$ is \begin{align*} A_1 &= M(A)\\ A_2 &= M(A\smallsetminus A_1) \\ A_3 &= M(A\smallsetminus (A_1\cup A_2)) \\ &\kern6pt\vdots\\ A_{n+1} &= M(A\smallsetminus(A_1\cup \cdots\cup A_n))\\ &\kern6pt\vdots \end{align*} My question is: can we state a version of the recursion theorem for this set $A$, according to this partial order?
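For a finite poset this layering can be computed directly; here is a rough Python sketch (the helper names and the `strictly_less` predicate are mine):

```python
def minimal_set(B, strictly_less):
    """M(B): elements of B with no strictly smaller element in B."""
    return {b for b in B if not any(strictly_less(m, b) for m in B)}

def layers(A, strictly_less):
    """A_1 = M(A), A_2 = M(A \\ A_1), ... for a finite well-founded poset."""
    remaining, parts = set(A), []
    while remaining:
        M = minimal_set(remaining, strictly_less)
        parts.append(M)
        remaining -= M
    return parts

# Proper divisibility on {1,...,8}: {1} is minimal, then the primes, and so on.
lt = lambda m, b: m != b and b % m == 0
print(layers(range(1, 9), lt))
```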

For instance, an example which comes to mind is the binomial coefficients, $$c(n,k) = \begin{cases} \hfil 1 \hfil & \text{if $(n,k)\in A_1$}\\ c(n-1,k-1) + c(n-1,k) & \text{otherwise}. \end{cases}$$ Here we have $A=\{(n,k)\in\mathbb N^2 : k\leqslant n\}$, and we are ordering it in the following way:

  • for any $n\in\mathbb N$, $(n,0) \preccurlyeq a$ for all $a\in A$
  • for any $n\in\mathbb N$, $(n,n) \preccurlyeq a$ for all $a\in A$
  • for any $n,k,\ell\in\mathbb N$, $(n,k) \preccurlyeq (n+1,\ell)$.

This captures the natural order implicit in Pascal's triangle:

[Image: Pascal's triangle.]

Note that $A_1$ contains precisely the base cases, and the remaining sets are such that we can define $c(n,k)$ at the level $A_k$ using its values at the previous levels.
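The binomial recursion above runs directly as code; a short Python sketch (memoised so each value is computed once):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def c(n, k):
    """Binomial coefficients on A = {(n, k) : k <= n}."""
    if k == 0 or k == n:          # (n, k) in A_1: the edges of the triangle
        return 1
    return c(n - 1, k - 1) + c(n - 1, k)

print(c(5, 2))  # 10
```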

Another example is remainder upon division:

$$r(n,d) = \begin{cases} \hfil n \hfil & \text{if $0\leqslant n< d$}\\ r(n-d,d) & \text{if $n \geqslant d$}\\ r(n+d,d) & \text{if $n < 0$} \end{cases}$$

Here the set is $A=\mathbb Z\times \mathbb N^{+}$ (the divisor $d$ must be positive for the recursion to terminate), and the order is

  • for any $d$, $(n,d)\preccurlyeq a$ for all $a\in A$ whenever $0\leqslant n< d$
  • $(n,d)\preccurlyeq (x,y)$ if $|ny|\leqslant|xd|$.
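As a sanity check, the remainder recursion can be run as written; a Python sketch (note that the base case uses $0\leqslant n<d$ so that the cases do not overlap, and that $d$ must be positive):

```python
def r(n, d):
    """Remainder of n upon division by d > 0, by the three-case recursion."""
    if 0 <= n < d:
        return n            # base case
    if n >= d:
        return r(n - d, d)  # step down towards [0, d)
    return r(n + d, d)      # n < 0: step up towards [0, d)

print(r(17, 5), r(-3, 5))  # 2 2
```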

If it is to be formulated similarly, I was thinking the theorem would look something like this:

Suppose $(A,\preccurlyeq)$ is a poset where $\preccurlyeq$ is well-founded, and $\{A_1, A_2,\dots\}$ is the partition of "minimal sets" induced by $\preccurlyeq$. Let $X$ be any set, let $x_a\in X$ for each $a\in A_1$, and let $g\colon A\times X^? \to X$ be a function. Then there exists a unique function $f\colon A\to X$ such that $$f(a) = \begin{cases} \hfil x_a\hfil & \text{if $a\in A_1$}\\ g(a, f(a_1),f(a_2),\dots) & \text{if $a\in A_k$ and $A_1\cup\cdots\cup A_{k-1} = \{a_1,a_2,\dots\},$} \end{cases}$$

i.e., the base cases are all for $a\in A_1$, and the inductive case is allowed to use the value of $f$ at any element of the previous sets $A_1,\dots,A_{k-1}$. But this formulation doesn't feel as succinct as it could be, and the arity of $g$ varies depending on which $A_k$ our $a$ lies in. It also doesn't cleanly capture the examples I had in mind; I can't clearly say what $g$ to take.

I appreciate any help with formulating this theorem. I want to have a general theorem which justifies "definition by recursion" for any well-founded poset.

Best answer:

There is no need to get a total order associated to your partial order to make sense of recursion. Indeed, you don't even need to have a partial order! Here is a simple way to formulate recursion with respect to any well-founded relation.

Theorem: Let $X$ be a set and let $<$ be a well-founded relation on $X$ (i.e., for any nonempty subset $A\subseteq X$, there exists $a\in A$ such that $b\not< a$ for all $b\in A$). Let $F:X\times V\to V$ be any (class) function. Then there exists a unique function $f$ on $X$ such that for each $x\in X$, $f(x)=F(x,f|_{I(x)})$ where $I(x)=\{y\in X:y<x\}$.

In other words, if you have a rule for defining $f(x)$ given $x$ and the values of $f(y)$ for all $y<x$, then this defines a unique function on all of $X$. For instance, you can use this on Pascal's triangle with $X=\{(a,b)\in\mathbb{N}^2:b\leq a\}$ where you say $(a,b)<(c,d)$ iff $c=a+1$ and $d$ is either $b$ or $b+1$ (i.e., $(a,b)$ is one of the two spots above $(c,d)$ in the triangle). The rule $F$ would then be to define $f(0,0)=1$, and otherwise define $f(c,d)$ to be the sum of all the values $f(a,b)$ such that $(a,b)<(c,d)$.
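This theorem is exactly what memoised recursion implements; here is a Python sketch of the general scheme, specialised to the Pascal's-triangle example above (the names `wf_rec` and `preds` are mine, and each predecessor set is assumed finite):

```python
def wf_rec(F, preds):
    """Well-founded recursion: returns f with f(x) = F(x, f restricted to I(x)).

    preds(x) must return I(x) = {y : y < x}; well-foundedness of < guarantees
    that the memoised recursion terminates (assuming each I(x) is finite).
    """
    cache = {}
    def f(x):
        if x not in cache:
            cache[x] = F(x, {y: f(y) for y in preds(x)})
        return cache[x]
    return f

# Pascal's triangle: (a, b) < (c, d) iff c = a + 1 and d is b or b + 1.
def preds(point):
    c, d = point
    return [(c - 1, b) for b in (d - 1, d) if 0 <= b <= c - 1]

F = lambda x, below: 1 if x == (0, 0) else sum(below.values())
binom = wf_rec(F, preds)
print(binom((5, 2)))  # 10
```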

Proof of Theorem: Say that a subset $I\subseteq X$ is an initial segment if $x\in I$ and $y<x$ implies $y\in I$. Note that if $I$ is an initial segment of $X$, then there exists at most one function $f$ on $I$ satisfying $f(x)=F(x,f|_{I(x)})$ for each $x\in I$ (given two such functions, by well-foundedness there is a minimal element on which they disagree, which then gives a contradiction). Call such a function good. Now let $S$ be the set of all such good functions defined on initial segments of $X$ (this is a set by Replacement), and let $f=\bigcup S$. Note that if $g,h\in S$ then $g$ and $h$ agree on $\operatorname{dom}(g)\cap\operatorname{dom}(h)$ since there is at most one good function on that set, and so $f$ is a function. The domain of $f$ is an initial segment of $X$ (since it is a union of initial segments), and $f$ is good since if $x\in\operatorname{dom}(f)$, there is some $g\in S$ such that $x\in\operatorname{dom}(g)$ and then $f$ and $g$ agree on $I(x)$ and also on $x$.

To complete the proof, it remains to be shown that $f$ is in fact defined on all of $X$. Suppose it is not; then there is a minimal element $x\in X$ where $f$ is not defined. This means $f$ is defined on $I(x)$. But now we can extend $f$ to a good function $f'$ on $\operatorname{dom}(f)\cup\{x\}$ by defining $f'(x)=F(x,f|_{I(x)})$. This $f'$ is then in $S$, contradicting the definition of $f$. Thus $f$ must be defined on all of $X$.

That said, your idea of partitioning your set is a natural and useful one, and is called the rank with respect to a well-founded relation. Here is one way to define it.

Definition: Let $X$ be a set and let $<$ be a well-founded relation on $X$. We then recursively define a function $\operatorname{rank}:X\to Ord$ as follows: $\operatorname{rank}(x)=\sup\{\operatorname{rank}(y)+1:y<x\}$. (Note that such a recursive definition makes sense using the Theorem above.)

So, $\operatorname{rank}(x)=0$ if $x$ is minimal, $\operatorname{rank}(x)=1$ if $x$ is not minimal but all its predecessors are minimal, and so on. Your sets $A_k$ are exactly the elements of rank $k-1$. In general, though, the rank of an element may not be finite (in other words, you would need to continue your process transfinitely in order to exhaust your set). For instance, if $X$ is well-ordered, then the rank function is just the unique isomorphism from $X$ to an ordinal.
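When every element has finitely many predecessors, the rank function is itself computable by the Theorem; a short Python sketch (using, as a hypothetical example, proper divisibility on the positive integers):

```python
def rank(x, preds):
    """rank(x) = sup{rank(y) + 1 : y < x}; the max over an empty set is 0."""
    return max((rank(y, preds) + 1 for y in preds(x)), default=0)

# y < x iff y properly divides x.
divs = lambda x: [y for y in range(1, x) if x % y == 0]
print(rank(12, divs))  # 3, realised by the chain 1 < 2 < 4 < 12
```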

Now, how can you use rank to make recursive definitions? Well, just observe that if you define $x\prec y$ to mean that $\operatorname{rank}(x)<\operatorname{rank}(y)$, then $\prec$ is also a well-founded relation on $X$ (indeed, this uses only the fact that $\operatorname{rank}$ is some function to the ordinals!). So, using the Theorem, you can recursively define functions on $X$ by defining $f(x)$ in terms of the values of $f(y)$ for all $y$ of lower rank than $x$.