Chaitin's construction

From HaskellWiki

Revision as of 14:59, 5 August 2006


1 Introduction

Are there any real numbers which are defined exactly, but cannot be computed? This question leads us to exact real arithmetic and to the foundations of mathematics and computer science.

See the Wikipedia article on Chaitin's construction.

2 Basing it on combinatory logic

To relate it more directly to functional programming: we can base Ω on combinatory logic (instead of a Turing machine).

2.1 Coding

See the prefix coding system described in Binary Lambda Calculus and Combinatory Logic (page 20) written by John Tromp:

\widehat{\mathbf S} \equiv 00
\widehat{\mathbf K} \equiv 01
\widehat{\left(x y\right)} \equiv 1 \widehat x \widehat y

Of course, x and y are meta-variables here, and also some other notations are changed slightly compared to the paper.
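This prefix code can be written down directly in Haskell. A minimal sketch, assuming an illustrative CL term type (the names below are assumptions, not from the article):

```haskell
-- Illustrative CL term type: the two base combinators and application
data CL = S | K | App CL CL
  deriving (Eq, Show)

-- Tromp's prefix code: S ↦ 00, K ↦ 01, (x y) ↦ 1 followed by the codes of x and y
encode :: CL -> [Bool]
encode S         = [False, False]
encode K         = [False, True]
encode (App x y) = True : encode x ++ encode y
```

For example, `encode (App S K)` yields `[True,False,False,False,True]`, i.e. the bit sequence 1 00 01.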

2.2 Decoding

Having seen this, decoding is rather straightforward. Here is a parser for illustration, but it serves only didactic purposes: it will not be used in the final implementation, because a good term generator makes parsing superfluous for this task.
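A decoder sketch (dc in the notation below), again assuming the same illustrative CL type: it consumes a prefix of the bit sequence and returns the decoded term together with the unconsumed rest, failing on syntactically incorrect input:

```haskell
data CL = S | K | App CL CL
  deriving (Eq, Show)

-- Decode one term from a prefix of the input, returning the unconsumed rest.
-- 00 ↦ S, 01 ↦ K, 1 ... ↦ an application of the next two decoded terms.
decode :: [Bool] -> Maybe (CL, [Bool])
decode (False : False : rest) = Just (S, rest)
decode (False : True  : rest) = Just (K, rest)
decode (True : rest) = do
  (x, rest')  <- decode rest
  (y, rest'') <- decode rest'
  return (App x y, rest'')
decode _ = Nothing
```

For example, `decode [True,False,False,False,True]` yields `Just (App S K, [])`, while an incomplete input such as `[True,False]` yields `Nothing`.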

2.3 Chaitin's construction

Now, Chaitin's construction is the following sum:

\sum_{p\in \mathrm{Dom}_\mathrm{dc},\;\mathrm{hnf}\left(\mathrm{dc}\;p\right)} 2^{-\left|p\right|}


\mathrm{hnf} should denote a unary predicate “has normal form” (“terminates”)
\mathrm{dc} should mean an operator “decode” (a function from finite bit sequences to combinatory logic terms)
\left\{0,1\right\}^* should denote the set of all finite bit sequences
\mathrm{Dom}_\mathrm{dc} should denote the set of syntactically correct bit sequences (semantically, they may either terminate or diverge), i.e. the domain of the decoding function, i.e. the range of the coding function. Thus, \left\{00, 01, 1\;00\;00, 1\;00\;01, 1\;01\;00, 1\;01\;01, \dots\right\} = \mathrm{Dom}_{\mathrm{dc}} = \mathrm{Rng}_{\widehat\ }
\left|\cdot\right| (“absolute value”) should mean the length of a bit sequence (not combinatory logic term evaluation!)
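As a worked check (not in the original article), the first few terms of the sum give a lower bound: the two shortest programs 00 and 01 decode to \mathbf S and \mathbf K, which are normal forms, and each of the four five-bit programs 1\;00\;00, \dots, 1\;01\;01 decodes to an application of two atoms, which is also a normal form (\mathbf S needs three arguments and \mathbf K two before it can reduce). Hence

\Omega \ge 2\cdot 2^{-2} + 4\cdot 2^{-5} = 0.5 + 0.125 = 0.625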

3 Eliminating any concept of code by handling combinatory logic terms directly

We can avoid referring to any code notion, if we transfer (lift) the notion of “length” from bit sequences to combinatory logic terms in an appropriate way. Let us call it the “norm” of the term:

\sum_{p\in\mathrm{CL},\;\mathrm{hnf}\;p} 2^{-\left\Vert p\right\Vert}


\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N
\left\Vert\mathbf K\right\Vert \equiv 2
\left\Vert\mathbf S\right\Vert \equiv 2
\left\Vert\left(x\;y\right)\right\Vert \equiv 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert

Thus, we have no notions of “bit sequence”, “code”, “coding”, “decoding” at all. But their ghosts still haunt us: the definition of the norm function looks rather strange unless we recall that it was transferred from a concept of coding.
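The norm transliterates directly to Haskell, a sketch assuming the same illustrative CL type as in the earlier snippets:

```haskell
data CL = S | K | App CL CL

-- The norm lifted from Tromp's coding: an atom costs 2 (bits), an application 1.
-- For any term t, norm t equals the length of t's prefix code.
norm :: CL -> Integer
norm S         = 2
norm K         = 2
norm (App x y) = 1 + norm x + norm y
```

For example, `norm (App S K)` is 5, matching the five-bit code 1 00 01.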

3.1 More natural norm functions (from CL terms)

Question: if we have already moved away from approaches referring to any code concept, could we define the norm in other ways? E.g.

\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N
\left\Vert\mathbf K\right\Vert \equiv 1
\left\Vert\mathbf S\right\Vert \equiv 1
\left\Vert\left(x\;y\right)\right\Vert \equiv 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert

And is it worth doing at all? The former, at least, had a good theoretical foundation (based on analysis, arithmetic and probability theory). The latter is not so much cleaner that we should prefer it, and it lacks such theoretical grounds.
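One concrete point against the simpler norm (a sketch of an argument, not from the original article): the original norm comes from a prefix-free code, so the weights 2^{-\left\Vert p\right\Vert} sum to at most 1 by Kraft's inequality (in fact, summed over all of \mathrm{CL} they equal exactly 1), which is what lets the number be read as a probability. With \left\Vert\mathbf S\right\Vert = \left\Vert\mathbf K\right\Vert = 1, a term with k atoms has norm 2k-1, but there are 2^k C_{k-1} such terms (C_n the Catalan numbers, C_n \sim 4^n n^{-3/2}\pi^{-1/2}), so

\sum_{p\in\mathrm{CL}} 2^{-\left\Vert p\right\Vert} = \sum_{k\ge 1} 2^k C_{k-1}\, 2^{-(2k-1)} = \sum_{n\ge 0} C_n 2^{-n} = \infty

and a quick check suggests the sum still diverges when restricted to terms having normal forms, so this norm does not define a probability at all.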

What I really want is to exclude what I regard as an underestimation of this “probability of termination” number: an underestimation that comes from taking the syntactically incorrect codes into account. Thus, only termination vs. non-termination is taken into account when calculating this number (which can be interpreted as a probability).

4 Implementation

To do: writing a program in Haskell -- or in combinatory logic :-) -- which could help in making conjectures on combinatory logic-based Chaitin's constructions. It would compute only approximations, in a similar way to how most Mandelbrot-plotting software works. The analogy:

  • they ask for a maximum limit of iterations, so that they can make a conjecture on convergence of a series;
  • this program will ask for a maximum limit of reduction steps, so that it can make a conjecture on termination (having a normal form) of a CL term.

Explanation: non-termination of an examined CL term cannot be proven by the program, but a good conjecture can be made: if termination does not take place within the given limit of reduction steps, then the examined CL term is regarded as non-terminating.

chaitin --model-of-computation=cl --encoding=tromp --limit-of-reduction-steps=500 --digits=9 --decimal
chaitin --model-of-computation=cl --encoding=direct --limit-of-reduction-steps=500 --digits=9 --decimal
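A self-contained sketch of such an approximator, in the spirit of the --encoding=direct invocation above (all names, the reduction strategy, and the enumeration order are assumptions, not a finished implementation): it enumerates CL terms directly, conjectures non-termination after the given limit of reduction steps, and sums the weights 2^{-‖p‖}:

```haskell
data CL = S | K | App CL CL

-- One leftmost-outermost reduction step, if any redex exists
step :: CL -> Maybe CL
step (App (App K x) _)         = Just x
step (App (App (App S x) y) z) = Just (App (App x z) (App y z))
step (App x y) = case step x of
  Just x' -> Just (App x' y)
  Nothing -> fmap (App x) (step y)
step _ = Nothing

-- Conjecture: a term has a normal form iff it reaches one within the step limit
hasNormalForm :: Int -> CL -> Bool
hasNormalForm n t
  | n < 0     = False                 -- give up: regard as non-terminating
  | otherwise = case step t of
      Nothing -> True
      Just t' -> hasNormalForm (n - 1) t'

-- The norm lifted from the coding: an atom costs 2, an application 1
norm :: CL -> Integer
norm S         = 2
norm K         = 2
norm (App x y) = 1 + norm x + norm y

-- All terms built from exactly k atoms
termsOfSize :: Int -> [CL]
termsOfSize 1 = [S, K]
termsOfSize k = [ App x y | i <- [1 .. k - 1]
                          , x <- termsOfSize i
                          , y <- termsOfSize (k - i) ]

-- Lower-bound approximation of Omega from terms with at most maxAtoms atoms
approxOmega :: Int -> Int -> Double
approxOmega maxAtoms limit = sum
  [ 2 ^^ negate (norm t)
  | k <- [1 .. maxAtoms]
  , t <- termsOfSize k
  , hasNormalForm limit t ]
```

For instance, `approxOmega 2 100` gives 0.625: the atoms contribute 2 · 2⁻² and the four two-atom terms (all normal forms) contribute 4 · 2⁻⁵, matching the hand computation above.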

4.1 Term generator

 module CLGen where
 import Generator (gen0)
 import CL (k, s, apply)
 direct :: [CL]
 direct = gen0 apply [s, k]

See combinatory logic term modules here.

 module Generator (gen0) where
 import PreludeExt (cross)
 gen0 :: (a -> a -> a) -> [a] -> [a]
 gen0 f c = gen f c 0
 gen :: (a -> a -> a) -> [a] -> Integer -> [a]
 gen f c n = sizedGen f c n ++ gen f c (succ n)
 sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
 sizedGen f c 0 = c
 sizedGen f c n = map (uncurry f)
                  (concat [sizedGen f c i `cross` sizedGen f c (n - 1 - i) | i <- [0 .. n - 1]])
 module PreludeExt (cross) where
 cross :: [a] -> [a] -> [(a, a)]
 cross xs ys = [(x, y) | x <- xs, y <- ys]
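As a quick sanity check of the enumeration order, the generator can be instantiated at strings instead of CL terms. The following is a self-contained copy for experimenting (illustrative only; `showApp` is a made-up pretty-printer, not part of the modules above):

```haskell
cross :: [a] -> [a] -> [(a, a)]
cross xs ys = [(x, y) | x <- xs, y <- ys]

-- Terms with exactly n applications: split the n - 1 remaining
-- applications between the two subterms in every possible way
sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
sizedGen _ c 0 = c
sizedGen f c n = map (uncurry f)
                 (concat [ sizedGen f c i `cross` sizedGen f c (n - 1 - i)
                         | i <- [0 .. n - 1] ])

-- All terms, enumerated by increasing number of applications
gen0 :: (a -> a -> a) -> [a] -> [a]
gen0 f c = concatMap (sizedGen f c) [0 ..]

-- Made-up pretty-printing "application" on strings
showApp :: String -> String -> String
showApp x y = "(" ++ x ++ " " ++ y ++ ")"
```

Here `take 6 (gen0 showApp ["S", "K"])` yields `["S","K","(S S)","(S K)","(K S)","(K K)"]`, i.e. first the atoms, then all four two-atom applications.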

5 Related concepts

6 To do

  • Making tasks described in #Implementation
  • Making more natural norm functions (from CL-terms), see #More natural norm functions (from CL terms)