Chaitin's constant and coding undecidable propositions in a number


Chaitin's constant $\Omega$ is defined so that its digits encode solutions to the halting problem; since the halting problem is undecidable, $\Omega$ is uncomputable. I am not particularly familiar with the field of algorithmic information theory, and I have a question about a statement I once read in a paper by the physicist David Deutsch:

If the dynamics of some physical system did depend on a function not in $C(T)$ [the set of computable functions], then that system could in principle be used to compute the function. Chaitin (1977) has shown how the truth values of all ‘interesting’ non-Turing decidable propositions of a given formal system might be tabulated very efficiently in the first few significant digits of a single physical constant [Emphasis mine].

I have read the paper referenced, and I think I understand (a fair bit of) it, but I still do not understand exactly where the italicized statement comes from; I could not seem to find relevant results in the paper. Would anybody be able to explain exactly where the statement that all of a formal system's ‘interesting’ undecidable propositions can be efficiently tabulated in a real number's first few significant digits comes from, how such tabulation would proceed, and what (if any) is the relation of this to $\Omega$ and Chaitin's work?

On BEST ANSWER

Expanding on my comments:

In a prefix-free coding of first-order arithmetic, instead of writing $\exists\ y\ .\ y*y=x$, we would write something like $\exists\ y\ .\ = * \ y \ y \ x$ (Polish notation, with each operator preceding its arguments). It's also convenient to work with a binary alphabet, so assign a binary word to every symbol (for example $\exists \rightarrow 0000$, $x \rightarrow 0001$, etc.), either in a fixed-width or prefix-free way, so that the symbol boundaries can be decoded unambiguously.
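A minimal sketch of such a symbol coding (my own toy example, not from Chaitin's paper; the symbol set and the 4-bit width are arbitrary choices). Fixed-width codes are trivially prefix-free at the symbol level, so a bit string splits back into symbols unambiguously:

```python
# Toy encoding of Polish-notation formulas with fixed-width symbol codes.
# 'E' stands for the existential quantifier; the rest are literal symbols.
SYMBOLS = ['E', 'A', '=', '*', '+', 'y', 'x', '0']
WIDTH = 4
ENCODE = {s: format(i, f'0{WIDTH}b') for i, s in enumerate(SYMBOLS)}
DECODE = {v: k for k, v in ENCODE.items()}

def encode(formula):
    """Concatenate the fixed-width code of each symbol into one bit string."""
    return ''.join(ENCODE[s] for s in formula)

def decode(bits):
    """Split the bit string into WIDTH-sized chunks and look each one up."""
    return [DECODE[bits[i:i + WIDTH]] for i in range(0, len(bits), WIDTH)]

# "exists y . y*y = x" written in Polish notation:
phi = ['E', 'y', '=', '*', 'y', 'y', 'x']
bits = encode(phi)
assert decode(bits) == phi          # round-trip: boundaries decode uniquely
assert len(bits) == WIDTH * len(phi)
```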

Now let $L$ be such a binary prefix-free language encoding the formulas of arithmetic and define $\Omega = \sum_{x \in L : \text{Pr}(x)}{2^{-\vert x \vert}}$ where $\text{Pr}$ is the provability predicate for $L$. $\Omega$ is the probability that an infinite random bit string starts with a theorem.
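As a toy illustration of the sum defining $\Omega$ (an assumption-laden sketch: the bit strings below are hypothetical codes of provable formulas, hard-coded here because real provability is only semi-decidable and could at best be enumerated, never listed):

```python
from fractions import Fraction

# Hypothetical codes of provable formulas in some prefix-free language L
# (stand-ins; no real prover is involved). No code is a prefix of another.
provable = ['00', '0100', '0111', '10110']

# Omega = sum over provable x of 2^{-|x|}, computed exactly with Fractions.
omega = sum(Fraction(1, 2 ** len(x)) for x in provable)
print(omega)  # -> 13/32
```

With exact rational arithmetic the partial sum is $1/4 + 1/16 + 1/16 + 1/32 = 13/32$; the real $\Omega$ is the limit of such partial sums as the enumeration of theorems proceeds forever.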

Given the first $n$ bits of $\Omega$, i.e. $\Omega_n = \lfloor \Omega \cdot 2^n \rfloor \cdot 2^{-n}$, we can decide $\text{Pr}(x)$ for any formula $x$ with $\vert x \vert \le n$. Begin proving all theorems of arithmetic starting from the axioms, keeping a running total of the contribution $2^{-\vert t \vert}$ of every theorem $t$ proved so far. Eventually that total reaches or exceeds $\Omega_n$, since it tends to $\Omega \ge \Omega_n$. At that point all theorems of length at most $n$ bits have been proved: any missing one would contribute at least $2^{-n}$ more, forcing $\Omega \ge \Omega_n + 2^{-n}$, which contradicts $\Omega < \Omega_n + 2^{-n}$. So we can now determine $\text{Pr}(x)$ for any formula $n$ bits or shorter simply by checking whether it has appeared among the theorems proved so far.
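The procedure above can be simulated in miniature (a sketch under heavy assumptions: the real prover enumerating all theorems of arithmetic is replaced by a hard-coded toy list, in the order a hypothetical prover finds them, and $\Omega$ is computable here only because the list is finite):

```python
import math
from fractions import Fraction

# Hypothetical theorem codes, in enumeration order; stand-in for a prover.
ALL_THEOREMS = ['00', '0100', '0111', '10110']
OMEGA = sum(Fraction(1, 2 ** len(t)) for t in ALL_THEOREMS)

def decide(x, n):
    """Decide provability of a formula code x with |x| <= n, given Omega_n."""
    assert len(x) <= n
    omega_n = Fraction(math.floor(OMEGA * 2 ** n), 2 ** n)  # first n bits
    total, proved = Fraction(0), set()
    for t in ALL_THEOREMS:              # enumerate theorems one by one
        total += Fraction(1, 2 ** len(t))
        proved.add(t)
        if total >= omega_n:            # all theorems of <= n bits are in
            break
    return x in proved

assert decide('0100', 4) is True        # a "theorem" of at most 4 bits
assert decide('0101', 4) is False       # a non-theorem of the same length
```

Note that the loop may stop before the whole list is exhausted: with $n = 4$ it halts once the total reaches $\Omega_4$, never needing the 5-bit theorem, which is exactly why knowing only $n$ bits of $\Omega$ suffices for formulas up to $n$ bits.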

It's fun to imagine $\Omega$ appearing as a dimensionless physical constant measurable to thousands of digits, or perhaps engraved on a monolith buried on the moon alongside the axioms of ZFC. I think it is certainly a paradox that mathematical truth is so "compressible".

One way out is to say that there's no way we can really know that the $n$-bit string written on the monolith really is $\Omega_n$ (even supposing there is other writing that claims it is equal to $\Omega$, that it was created by the gods, etc.). There is also the point that it's probably computationally infeasible to wait until all short theorems are proved, so the constant would be useless in practice.

One thing we might get out of it is a situation like this: suppose there are $5$ open problems (formulas not known to be theorems and not known not to be) expressible in $n$ bits; after choosing to believe in a particular $\Omega_n$, we might then "know" that exactly $3$ of them are provable and $2$ are not, but that by itself might get us no closer to discovering which are which.