Definition of computability of real numbers?


What exactly does it mean to say that a real number $x$ is computable? I can think of two reasonable definitions, but I am not sure whether they are equivalent:

1) There is an algorithm which, given $n$, outputs the $n^{\text{th}}$ bit of the binary representation of $x$.

2) There is an algorithm which, given $n \ge 1$, outputs a rational number $q$ with $\lvert q-x \rvert \le \frac{1}{n}$.

The troublesome case is the following: let $z = \frac{1}{m}$, where $m$ is the length of the shortest proof of a contradiction in ZFC if ZFC is inconsistent, and $z = 0$ if ZFC is consistent, and let $x = 1 - z$. So either $x = 1$, or $x = 1 - \frac{1}{m} < 1$ and its binary expansion begins $0.\ldots$. Seemingly definition 2) is satisfied but definition 1) is not. For 2), given $n$ the program can search for a contradiction among all ZFC proofs of length at most $n$: if it finds one, with shortest length $m$, it outputs the exact rational $1 - \frac{1}{m}$; if it finds none, it outputs $1$, which is within $\frac{1}{n}$ of $x$ either way, since then either $x = 1$ or $x = 1 - \frac{1}{m}$ with $m > n$. For 1), we seemingly cannot determine even the first bit (does the representation begin $1.000\ldots$ or $0.\ldots$?) without deciding whether ZFC is inconsistent. Is this reasoning that the definitions are not equivalent correct? If so, which definition (perhaps neither) is standard, and how can the argument be stated more clearly? If the definitions are in fact equivalent, why?
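To make the definition-2) algorithm concrete, here is a minimal Python sketch. The proof search is a hypothetical stub (a real implementation would enumerate formal ZFC proofs, which is not attempted here); the stub returning `None` models the search finding no contradiction within the bound.

```python
from fractions import Fraction

def shortest_contradiction_up_to(bound):
    """Hypothetical stub: return the length of the shortest ZFC proof of a
    contradiction if one of length <= bound exists, else None.  A real
    implementation would enumerate all formal proofs up to the bound."""
    return None

def approx(n, search=shortest_contradiction_up_to):
    """Return a rational q with |q - x| <= 1/n, where x = 1 - z as above."""
    m = search(n)
    if m is not None:
        # A shortest contradiction of length m was found, so x is exactly 1 - 1/m.
        return Fraction(1) - Fraction(1, m)
    # No contradiction of length <= n: either x = 1, or x = 1 - 1/m with
    # m > n, so |1 - x| = 1/m < 1/n.  Output 1 in either case.
    return Fraction(1)
```

The point of the example is that `approx` is a correct algorithm for definition 2) regardless of whether ZFC is consistent, because the bounded search always terminates.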

3 Answers

Accepted answer:

The number you describe is rational. Thus it is certainly computable, by either of your definitions. The fact that we don't know which algorithm computes approximations to $x$ is not relevant.

Answer:

The reasoning is not correct. If ZFC is consistent, there is an algorithm that outputs $x$, namely "output $1$". If ZFC is inconsistent, then $x = 1 - \frac{1}{m}$ for some specific $m$, and "output $1 - \frac{1}{m}$" works. Either way an algorithm exists; it's just that we don't happen to know which one is correct.

Answer:

The other answers address your particular example, but not the equivalence of the two representations.

The base-2 representation of real numbers and the Cauchy representation (your second representation) do indeed define the same set of computable numbers.
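The easy direction of this equivalence (a digit algorithm yields a Cauchy approximation) can be sketched in Python: truncating the binary expansion after $k$ bits, with $2^k \ge n$, gives an error below $\frac{1}{n}$. The digit function `third_digit` here is just an illustrative stand-in, giving the bits of $1/3 = 0.010101\ldots_2$.

```python
from fractions import Fraction

def third_digit(k):
    """Illustrative stand-in: the k-th binary digit of 1/3 = 0.010101..."""
    return 0 if k % 2 == 1 else 1

def cauchy_from_digits(n, digit=third_digit):
    """Given a bit algorithm for some x in [0, 1), return a rational q with
    |q - x| <= 1/n.  Truncating after k bits gives 0 <= x - q < 2^(-k),
    so any k with 2^k >= n suffices."""
    k = 1
    while 2 ** k < n:
        k += 1
    return sum(Fraction(digit(i), 2 ** i) for i in range(1, k + 1))
```

The converse direction (Cauchy approximations to binary digits) also holds for each individual computable number, but not by any single algorithm uniform in the representation, which is exactly why the computable *functions* on the two representations differ, as noted below.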

However, the functions computable on these representations differ: multiplication by $3$ is not computable in the base-2 representation (Turing originally noted this for base 10). Intuitively, if the input begins $0.0101\ldots$, the first output bit of $3x$ depends on whether $x < \frac{1}{3}$ or $x \ge \frac{1}{3}$, which no finite prefix of the input decides.