Definition of (lazy) conditional in partial recursive functions


I'm currently working on formalizing the theory of partial recursive functions, and something seems peculiar about the standard definition. The definition of the primitive recursion clause of $\mu$-recursive functions reads as follows (according to Wikipedia):

Given the $k$-ary function $g({\bf x})$ and $k+2$-ary function $h(y,z,{\bf x})$: \begin{align} \rho(g, h) &\stackrel{\mathrm{def}}{=} f \quad\text{where}\\ f(0,{\bf x}) &= g({\bf x}) \\ f(y+1,{\bf x}) &= h(y,f(y,{\bf x}),{\bf x}). \end{align}

I assume that this definition uses eager evaluation, which is to say that $h(y,f(y,{\bf x}),{\bf x})$ is defined only when $f(y,{\bf x})$ is, even if, when we evaluate $h$, it turns out to be a projection or otherwise to ignore that argument.

Now this evaluation order means that if $f(y,{\bf x})=\bot$ then $f(y+1,{\bf x})=\bot$. But now how can I define a conditional operator like so:

$$\mathrm{ifz}(g,h)(x)\simeq\begin{cases} g(x) & \text{if } x = 0,\\ h(x) & \text{if } x\ne 0,\end{cases}$$

where $a\simeq b$ means that both sides are undefined, or both defined and equal? Without worrying about definedness, $\mathrm{ifz}$ is a trivial application of primitive recursion, but the naive definition will have $\mathrm{ifz}(\bot,0)(1)=\bot$ (where $\bot$ denotes the everywhere-undefined function and $0$ the constant zero function), since it first tries to evaluate the zero branch. Since Turing machines can obviously implement an $\mathrm{ifz}$ operator, either my definition is wrong or there is a more clever way to construct it.
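To make the failure concrete, here is a minimal Python sketch (all names are illustrative, not part of any standard library) that models eager primitive recursion, with `None` standing in for $\bot$; the naive $\mathrm{ifz}$ built from it returns bottom on exactly the case described above:

```python
BOT = None  # stand-in for "undefined" (a real partial function would loop forever)

def rho(g, h):
    """Eager primitive recursion: f(0, x) = g(x), f(y+1, x) = h(y, f(y, x), x).
    'Eager' means the recursive value is demanded before h is applied."""
    def f(y, x):
        acc = g(x)
        for i in range(y):
            if acc is BOT:        # bottom propagates under eager evaluation
                return BOT
            acc = h(i, acc, x)
        return acc
    return f

def ifz(g, h):
    """Naive conditional: ifz(g, h)(x) = f(x, x), where h's wrapper ignores
    the recursive value and just applies h to the original argument."""
    f = rho(g, lambda y, z, x: h(x))
    return lambda x: f(x, x)

g_bot = lambda x: BOT   # the everywhere-undefined function
h_zero = lambda x: 0    # the constant-zero function

print(ifz(g_bot, h_zero)(1))  # → None, i.e. bottom, even though h_zero(1) = 0
```

The eager check `if acc is BOT` is precisely the propagation $f(y,{\bf x})=\bot \Rightarrow f(y+1,{\bf x})=\bot$ from the question.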

Related (but also unanswered): Eager vs. lazy interpretation of recursive functions


There are 2 answers below.

Best answer:

In many treatments of computability theory, primitive recursion is only applied to total functions, and then the $\mu$ operator is also only applied to total functions. Composition is the only operation that needs to be applied to (potentially) partial functions.

This shows up in Kleene's normal form theorem: each partial computable function $f(\bar x)$ can be written in the form $$ f(\bar x) \simeq U(\mu z. T(e,z,\bar x)) $$ for some fixed "index" $e$. Here both $T$ and $U$ are specific total primitive recursive functions, and the only possible way that $f$ could be partial is from the single application of the $\mu$ operator.

So, if $g$ and $h$ are partial computable, how do we find the normal form for the partial function $I(g,h,y,x)$ with $$ I(g,h,y,x) \simeq \begin{cases} g(x) & y = 0, \\ h(x) & y \not = 0.\end{cases} $$

Start by letting $e_g$ be an index of $g$ and $e_h$ be an index of $h$. Let $w(z,y,x)$ be the primitive recursive relation $$ w(z,y,x) = \text{True} \leftrightarrow [ ( y = 0 \land T(e_g, z, x) ) \lor ( y \not = 0 \land T(e_h, z, x) ) ]. $$ This $w$ is a total primitive recursive function. Then we have $$ I(g,h,y,x) \simeq U(\mu z . w(z,y,x)) $$ as desired: a $z$ satisfying $w(z,y,x)$ is a halting computation of the appropriate machine ($e_g$ or $e_h$) on input $x$, and $U$ extracts its output. Here we do not need to worry about whether the definition of primitive recursion allows for lazy evaluation, because the case split on $y$ is built into the total predicate $w$.
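This construction can be sketched in Python with a toy stand-in for Kleene's coding. In this toy model a "certificate" $z$ directly encodes the output, the $T_*$ are total predicates saying "$z$ certifies a halting computation on $x$", and the decoders are trivial; the theorem's single decoder $U$ is split into per-branch decoders here for simplicity. All names are illustrative assumptions, not Kleene's actual $T$ and $U$:

```python
from itertools import count

def mu(p):
    """The mu operator: least z with p(z) true (diverges if there is none)."""
    return next(z for z in count() if p(z))

def make_conditional(T_g, U_g, T_h, U_h):
    """Build I(y, x) ~ g(x) if y == 0 else h(x) from normal-form pieces.
    T_* and U_* are total, so partiality enters only through mu."""
    def w(z, y, x):
        # the combined total predicate from the answer above
        return (y == 0 and T_g(z, x)) or (y != 0 and T_h(z, x))
    def I(y, x):
        z = mu(lambda z: w(z, y, x))
        return U_g(z) if y == 0 else U_h(z)
    return I

# Toy instances: g is everywhere undefined, h(x) = x + 1.
T_g = lambda z, x: False        # no certificate ever exists: g(x) is bottom
U_g = lambda z: z
T_h = lambda z, x: z == x + 1   # the certificate is simply the output
U_h = lambda z: z

I = make_conditional(T_g, U_g, T_h, U_h)
print(I(1, 5))  # → 6, even though g is undefined everywhere
```

Calling `I(0, x)` would loop in `mu`, which is the correct behavior: that branch is $g(x)=\bot$.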

Another answer:

$\mathrm{ifz}(g,h)$ can be defined by

$\rho(g,\,h\circ P^3_3)\circ(P^1_1,P^1_1)$

that is, $\mathrm{ifz}(g,h)(x)=f(x,x)$ where $f=\rho(g,\,h\circ P^3_3)$, so that $f(0,x)=g(x)$ and $f(y+1,x)=h(x)$. (Under the convention $h(y,z,{\bf x})$ from the question, the projection must pick out the third argument: projecting the first would give $h(y)=h(x-1)$ in the successor case.)

Note that the evaluation strategy is not a problem here, because $\mathrm{ifz}$ only needs to evaluate its input (is $x=0$?), and not the result of $g(0)$ or $h(x)$, in order to decide what to do. But of course, if the strategy is not lazy, the case $x>0$ can be problematic. That's a good remark. I usually go with lazy evaluation when I teach or simulate these functions. Note that the semantics is supposed to be lazy anyway: $P^2_1(x_1,x_2)=x_1$ even if $x_2=\bot$.
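To illustrate, here is a Python sketch (names are illustrative) of lazy primitive recursion, where $h$ receives the recursive value as a thunk and forces it only on demand. With it, the construction above works even when $g$ is undefined; bottom is modeled by raising an exception:

```python
def rho_lazy(g, h):
    """Lazy primitive recursion: f(0, x) = g(x), f(y+1, x) = h(y, <thunk>, x),
    where the thunk computes f(y, x) only if h actually calls it."""
    def f(y, x):
        if y == 0:
            return g(x)
        return h(y - 1, lambda: f(y - 1, x), x)
    return f

def ifz(g, h):
    """ifz(g, h)(x) = f(x, x), with an h-wrapper that projects out the
    original argument and never forces the recursive thunk."""
    f = rho_lazy(g, lambda y, z, x: h(x))
    return lambda x: f(x, x)

def g_bot(x):
    raise RuntimeError("bottom")  # model the undefined function by raising

print(ifz(g_bot, lambda x: 2 * x)(3))  # → 6: the undefined branch is never forced
```

Because the wrapper `lambda y, z, x: h(x)` never calls the thunk `z`, the recursive value $f(y,x)$, and hence $g$, is never evaluated when $x>0$; this is exactly the lazy semantics $P^2_1(x_1,x_2)=x_1$ even when $x_2=\bot$.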