What is the meaning and purpose of this theorem of inductive definitions?


I'm in my first analysis class, so naturally we're beginning with developing the real number system. As a part of the discussion of the natural numbers and induction, this theorem has come up in my text:

Given a set $X$, an element $x_1 \in X$, and a sequence $\{f_n\}$ of functions from $X$ to $X$, there is a unique sequence $\{x_n\}$ in $X$, beginning with $x_1$, which satisfies $x_{s(n)} = f_n(x_n)$ for all $n \in \mathbb{N}$.

Is this theorem just saying that, given a base case $x_1$ within some set of statements $X$ and a sequence of "rules" $\{f_n\}$ associated with the cases that follow, there is a sequence $\{x_n\}$ of cases that satisfies each of its associated "rules"?

I've tried to come up with some very simple, plain English examples to make this digestible, so I thought this might get somewhat close to illustrating this theorem:

Let $x_n \in X$ and let $x_n = x_{n+3}$ for all $n \in \Bbb N$.

$x_1$ = It's sunny today (We assume this is true for our inductive definition)

$f_1$ = if it's sunny, then it's not raining

$f_2$ = if it's not raining, then I can comfortably wear shorts

$f_3$ = if I can comfortably wear shorts, then it's sunny outside

Then,

$x_2$ = It's not raining today

$x_3$ = I can comfortably wear shorts

So $x_{s(1)} = f_1(x_1) = x_2$ translates into plain English as, "It's sunny outside. If it's sunny outside, then it's not raining. Therefore, it's not raining."

Similarly, $x_{s(2)} = f_2(x_2) = x_3$ translates into "It's not raining today. If it's not raining today, then I can comfortably wear shorts. Therefore, I can comfortably wear shorts."

The condition $x_n = x_{n+3}$ for this trivial example is to satisfy the condition in the theorem that this is true for all $n \in \Bbb N$. Am I totally wrong about what the theorem is saying? I give an element $x_1$ in a set $X$ and a sequence of functions $\{f_n\}$, and the result is a sequence $\{x_n\}$ that satisfies $x_{s(n)} = f_n(x_n)$. If my example is a correct application of the theorem, then why is this theorem necessary for induction? And why doesn't it necessarily imply that $x_{s(n)} = x_{n+1}$? If my example isn't a correct application of the theorem, why?

There are 2 answers below.

Accepted answer:

If we write "$x + 1$" instead of "$s(x)$" for the successor function on $\mathbb{N}$, and if we change indexing of the sequences to start at $0 \in \mathbb{N}$, then the theorem in question is just a disguised form of:

Principle of Primitive Recursion:

If $X$ is a set, if $G : \mathbb{N} \times X \rightarrow X$ is a map, and if $c \in X$, then there exists a unique map $h : \mathbb{N} \rightarrow X$ such that $h(0) = c$ and $h(n + 1) = G\bigl(n, h(n)\bigr)$ for each $n \in \mathbb{N}$.

Given your sequence $(f_{n})_{n \in \mathbb{N}}$ of functions from $X$ to $X$ take $G(n, x) = f_{n}(x)$ and take $c = x_{0}$. Then the sequence that your theorem asserts exists will be given by $x_{n} = h(n)$ for all $n$.
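The construction above can be sketched in a few lines of Python. This is not part of the original answer, just a minimal illustration: `primitive_recursion` builds the finite prefix $h(0), \dots, h(n_{\max})$ of the unique $h$, and the sample $G$ (corresponding to the illustrative choice $f_n(x) = x + n$, not anything from the question) is an assumption made purely for demonstration.

```python
def primitive_recursion(G, c, n_max):
    """Return [h(0), h(1), ..., h(n_max)] for the unique h with
    h(0) = c and h(n + 1) = G(n, h(n))."""
    h = [c]
    for n in range(n_max):
        h.append(G(n, h[n]))
    return h

# A sequence of functions f_n packaged as G(n, x) = f_n(x);
# illustrative choice: f_n(x) = x + n.
def G(n, x):
    return x + n

print(primitive_recursion(G, 0, 5))  # [0, 0, 1, 3, 6, 10]
```

Uniqueness is reflected in the fact that the loop has no freedom: each $h(n+1)$ is forced by $h(n)$ and $n$.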

Example:

The simplest and best known application of the result is in defining the factorial function. Take $X = \mathbb{N}$, $c = 1$, and $G(n, x) = (n + 1) x$. [In terms of your theorem, the $n$th function $f_n : \mathbb{N} \rightarrow \mathbb{N}$ is given by $f_{n}(x) = (n + 1) x$.]
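A quick computational check of this example (a sketch, not part of the original answer): iterating $h(n+1) = G(n, h(n))$ with $c = 1$ and $G(n, x) = (n+1)x$ does produce the factorials.

```python
# Primitive recursion with c = 1 and G(n, x) = (n + 1) * x
# yields the factorial function: h(n) = n!.
def G(n, x):
    return (n + 1) * x

h = [1]  # h(0) = c = 1
for n in range(6):
    h.append(G(n, h[n]))

print(h)  # [1, 1, 2, 6, 24, 120, 720], i.e. 0!, 1!, ..., 6!
```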

Relation to other fundamental principles:

The Principle of Primitive Recursion is a consequence of the "Principle of (Ordinary) Recursion", which starts with a set $X$, an element $c \in X$, and a map $G : X \rightarrow X$, and asserts the existence of a unique map $h : \mathbb{N} \rightarrow X$ such that $h(0) = c$ and $h(n + 1) = G\bigl(h(n)\bigr)$. Note that this more basic principle cannot directly establish the existence of the factorial function, since its $G$ sees only the previous value $h(n)$ and not the index $n$ (unless you in effect reconstruct the proof of the Principle of Primitive Recursion in the special case needed for the factorial function).
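For contrast, here is the same kind of sketch for Ordinary Recursion (again an illustration, not from the original answer). The step function receives only the previous value, which is exactly why factorial does not fit this mold directly.

```python
def ordinary_recursion(G, c, n_max):
    """Return [h(0), ..., h(n_max)] for the unique h with
    h(0) = c and h(n + 1) = G(h(n)).
    G sees only the previous value, never the index n."""
    h = [c]
    for _ in range(n_max):
        h.append(G(h[-1]))
    return h

# Example: doubling gives h(n) = 2**n.
print(ordinary_recursion(lambda x: 2 * x, 1, 5))  # [1, 2, 4, 8, 16, 32]
```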

Comment:

Most good texts on set theory will include proofs of both the Principle of Ordinary Recursion and the Principle of Primitive Recursion (although their names are sometimes different). The former is a consequence of the axiom of induction, and some texts will directly prove the Principle of Primitive Recursion from the axiom of induction (and then Ordinary Recursion is an easy corollary of that). Unfortunately, all too many texts, especially at the introductory level, get the justification for existence of the factorial function completely wrong, saying in effect that it's an immediate consequence of Ordinary Recursion — it's not! — if, that is, they really bother to give any justification at all other than utter "hand-waving" that Induction provides the factorial function.

Second answer:

Your $f$s are not functions; they are propositions. A better three-step example would be $f_1(x)=x+1$, $f_2(x)=x^2$, $f_3(x)=2x$, with $f_{n+3}(x)=f_n(x)$. It should be the functions that repeat, not the $x$s. The theorem is telling you that if you start with any $x_1$ and successively apply the functions, you will get a unique sequence. So if $x_1=1$, you would say $x_2=f_1(x_1)=x_1+1=2$, $x_3=f_2(x_2)=2^2=4$, $x_4=f_3(x_3)=2\cdot 4=8$, and the sequence becomes $1,2,4,8,9,81,162,163,\ldots$ Working through this example should make you comfortable with the theorem. At each step you just apply a function to an argument to get the next term of the sequence. This allows you to define a function $g(x,n)$ which, given the set of $f$s, is the value of your sequence after $n$ steps starting with $x$. I have most often seen it applied with all the $f$s being the same, so your sequence is $x, f(x), f(f(x)), f(f(f(x))),\ldots$
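The cyclic example above can be checked mechanically. This short Python sketch (an illustration added here, with the list name `fs` chosen arbitrarily) applies $f_1, f_2, f_3$ in rotation, using $f_{n+3} = f_n$, and reproduces the sequence quoted in the answer.

```python
# Cycle of three functions: f_1(x) = x + 1, f_2(x) = x^2, f_3(x) = 2x,
# extended by f_{n+3} = f_n.
fs = [lambda x: x + 1, lambda x: x ** 2, lambda x: 2 * x]

x = [1]  # x_1 = 1
for n in range(7):
    x.append(fs[n % 3](x[-1]))  # x_{s(n)} = f_n(x_n)

print(x)  # [1, 2, 4, 8, 9, 81, 162, 163]
```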