What proof proves that given any number, when dividing by two recursively, you will always reach 1?


Let us imagine you have a function that looks like this:

def div(n):
    if n == 0:
        return 0
    elif n == 2:
        return 1
    elif n % 2 == 0:
        return div(n // 2)  # integer division, so n stays an integer
    else:
        return div(n + 1)   # odd: add 1 to make it even

Here you keep dividing the number by 2 while it is even, or add 1 to it when it is odd, with the goal of returning 1 if the number is positive and 0 if it is zero.
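As a quick sanity check, here is the function with integer division (`//`, so intermediate values stay integers) traced on a few inputs; every positive input reaches 1:

```python
def div(n):
    # Returns 0 for n == 0 and 1 for every positive integer n.
    if n == 0:
        return 0
    elif n == 2:
        return 1
    elif n % 2 == 0:
        return div(n // 2)  # halve even numbers
    else:
        return div(n + 1)   # bump odd numbers to the next even number

for n in (0, 1, 7, 10, 1000):
    print(n, "->", div(n))
```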

Does there exist a proof that establishes the termination of this recursive function?


Accepted answer

It's a straightforward proof by induction or infinite descent.

Here's one way to do it. Let (for $x$ a positive integer) $f(x)={x\over 2}$ if $x$ is even and ${x+1\over 2}$ if $x$ is odd. Then it's easy to check that $f(x)$ is always again a positive integer and that $f(x)<x$ unless $x=1$. Now let $x$ be arbitrary and consider the sequence $$x,f(x),f(f(x)), ...$$ By the point above, this is either an infinite descending sequence of positive integers - which can't happen - or eventually hits $1$.

(Note that it's crucial that $f$ be decreasing everywhere except $1$ - so this simple idea won't give a proof of the Collatz conjecture. And indeed there are "Collatz-like" functions which are known to have "wild" - in particular, $1$-avoiding - behavior.)

Another answer

You could build a binary tree rooted at $1$ in which each node $N$ has children $2N$ and $2N+1$, recursively. It should be possible to prove that reading such a tree row by row yields $1, 2, 3, 4, 5, \ldots$, so the tree contains every positive integer exactly once.
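A minimal sketch of that construction (the `bfs_rows` helper is an illustrative name, not from the original post): generate the tree level by level and observe that row $k$ holds exactly the integers $2^k$ through $2^{k+1}-1$, so the rows together enumerate all positive integers in order.

```python
def bfs_rows(depth):
    # Build `depth` rows of the tree rooted at 1 in which
    # node n has children 2n and 2n+1, read row by row.
    rows, current = [], [1]
    for _ in range(depth):
        rows.append(current)
        current = [c for n in current for c in (2 * n, 2 * n + 1)]
    return rows

print(bfs_rows(4))
```

Reversing the child maps $N \mapsto 2N$ and $N \mapsto 2N+1$ recovers the halving step: walking from any node back to the root is exactly the descent the question asks about.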