Chaitin's construction



1 Introduction

Are there any real numbers that are defined exactly, yet cannot be computed? This question leads us to exact real arithmetic, and to the foundations of mathematics and computer science.

See the Wikipedia article on Chaitin's construction.

2 Basing it on combinatory logic

For a more direct connection to functional programming, we can base Ω on combinatory logic instead of on a Turing machine.

2.1 Coding

See the prefix coding system described in Binary Lambda Calculus and Combinatory Logic (page 20) by John Tromp:

\widehat{\mathbf S} \equiv 00
\widehat{\mathbf K} \equiv 01
\widehat{\left(x\;y\right)} \equiv 1\;\widehat{x}\;\widehat{y}

Of course, x and y are metavariables, and some other notations have been changed slightly compared to the paper.
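
For illustration, here is a minimal sketch of this coding function in Haskell. The CL data type and the name code are assumptions of this sketch, not taken from the wiki's CL module:

 data CL = S | K | App CL CL
 
 -- Bits as Booleans: False stands for 0, True for 1.
 code :: CL -> [Bool]
 code S         = [False, False]            -- S      ~  00
 code K         = [False, True]             -- K      ~  01
 code (App x y) = True : code x ++ code y   -- (x y)  ~  1 x y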

2.2 Decoding

Having seen this, decoding is rather straightforward. A parser illustrates it below, but it serves only didactic purposes: it will not be used in the final implementation, because a good term generator makes parsing superfluous for this task.
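
A hedged sketch of such a parser, reusing the illustrative CL type from the coding section above:

 -- Parse one term from a bit sequence, returning the remaining bits;
 -- Nothing signals a syntactically incorrect (e.g. truncated) sequence.
 decode :: [Bool] -> Maybe (CL, [Bool])
 decode (False : False : bs) = Just (S, bs)   -- 00     ~  S
 decode (False : True  : bs) = Just (K, bs)   -- 01     ~  K
 decode (True : bs)          = do             -- 1 x y  ~  (x y)
     (x, bs')  <- decode bs
     (y, bs'') <- decode bs'
     Just (App x y, bs'')
 decode _                    = Nothing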

2.3 Chaitin's construction

Now, Chaitin's construction is the following sum:

\sum_{p\in \mathrm{Dom}_\mathrm{dc},\;\mathrm{hnf}\left(\mathrm{dc}\;p\right)} 2^{-\left|p\right|}

where

hnf
should denote a unary predicate “has normal form” (“terminates”)
dc
should mean the “decode” operator (a function from finite bit sequences to combinatory logic terms)
2^{*}
should denote the set of all finite bit sequences
\mathrm{Dom}_{\mathrm{dc}}
should denote the set of syntactically correct bit sequences (semantically, they may either terminate or diverge), i.e. the domain of the decoding function, i.e. the range of the coding function. Thus, \left\{00, 01, 1\;00\;00, 1\;00\;01, 1\;01\;00, 1\;01\;01, \dots\right\} = \mathrm{Dom}_{\mathrm{dc}} = \mathrm{Rng}_{\widehat{\ }}
\left|\cdot\right|
should mean the length of a bit sequence (not combinatory logic term evaluation!)
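
For a quick illustration: the two-bit codes 00 and 01 decode to S and K, which are already normal forms, and the four five-bit codes 1 00 00, 1 00 01, 1 01 00, 1 01 01 decode to the applications S S, S K, K S, K K, which are normal forms as well. These six terms alone give the lower bound

2\cdot 2^{-2} + 4\cdot 2^{-5} = \frac{1}{2} + \frac{1}{8} = 0.625

for the sum.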

3 Eliminating any concept of code by handling combinatory logic terms directly

We can avoid referring to any notion of code if we transfer (lift) the notion of “length” from bit sequences to combinatory logic terms in an appropriate way. Let us call it the “norm” of the term.

\sum_{p\in\mathrm{CL},\;\mathrm{hnf}\;p} 2^{-\left\Vert p\right\Vert}

where

\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N
\left\Vert\mathbf K\right\Vert \equiv 2
\left\Vert\mathbf S\right\Vert \equiv 2
\left\Vert\left(x\;y\right)\right\Vert \equiv 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert
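
As a sketch, this norm is immediate to transcribe into Haskell, again using the illustrative CL type from the coding section:

 -- Norm of a term: 2 for each constant, 1 plus the norms of the
 -- subterms for an application, mirroring the equations above.
 norm :: CL -> Integer
 norm S         = 2
 norm K         = 2
 norm (App x y) = 1 + norm x + norm y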

Thus, we have no notion of “bit sequence”, “code”, “coding”, or “decoding” at all. But their ghost still haunts us: the definition of the norm function looks rather strange until we recall that it was transferred from a concept of coding.

Question: if we have already moved away from approaches that refer to any code concept, could we define the norm in other ways? E.g.

\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N
\left\Vert\mathbf K\right\Vert \equiv 1
\left\Vert\mathbf S\right\Vert \equiv 1
\left\Vert\left(x\;y\right)\right\Vert \equiv 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert

And is it worth doing at all? The former one, at least, had a good theoretical foundation (analysis or arithmetic). This latter one is not so much cleaner that we should prefer it while it lacks such theoretical grounds.
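
One observation in favour of the original weights: with \left\Vert\mathbf S\right\Vert = \left\Vert\mathbf K\right\Vert = 2, the norm of a term is exactly the length of its code (two bits per constant, one bit per application), so by the Kraft property of the prefix code the sum of 2^{-\left\Vert p\right\Vert} over all terms is at most 1, as befits a probability. With weight 1 for the constants, the sum diverges: a term built from k constants contains k - 1 applications, hence has norm 2k - 1, and there are C_{k-1} \cdot 2^k such terms (C_n being the Catalan numbers), so

\sum_{k\geq 1} C_{k-1}\,2^k\,2^{-(2k-1)} = \sum_{j\geq 0} C_j\,2^{-j} = \infty

since C_j grows roughly like 4^j. Even the subsum over terms in normal form already diverges, so the alternative norm could not be interpreted as a probability at all.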

What I really want is to exclude any underestimation of this “probability of termination” number -- an underestimation that comes from taking syntactically incorrect codes into account. Thus only termination versus non-termination should be taken into account when calculating this number (which can then be interpreted as a probability).


3.1 Term generator

 module CLGen where
 
 import Generator (gen0)
 import CL (CL, k, s, apply)
 
 -- All combinatory logic terms, enumerated level by level
 direct :: [CL]
 direct = gen0 apply [s, k]

The generator itself lives in a separate module:

 module Generator (gen0) where
 
 import PreludeExt (cross)   -- cross xs ys = [(x, y) | x <- xs, y <- ys]
 
 -- Everything that can be built from the constants c with the binary
 -- operation f, smallest terms first
 gen0 :: (a -> a -> a) -> [a] -> [a]
 gen0 f c = gen f c 0
 
 gen :: (a -> a -> a) -> [a] -> Integer -> [a]
 gen f c n = sizedGen f c n ++ gen f c (succ n)
 
 -- All terms containing exactly n applications: the constants at level 0,
 -- otherwise applications of two smaller terms whose levels sum to n - 1
 sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
 sizedGen f c 0 = c
 sizedGen f c n = map (uncurry f) $
     concat [sizedGen f c i `cross` sizedGen f c (n - 1 - i) | i <- [0 .. n - 1]]
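
For example, assuming cross enumerates pairs in the usual list-comprehension order, the enumeration begins with the two constants followed by the four two-constant applications: take 6 direct yields s, k, s s, s k, k s, k k (up to the Show formatting of the CL module).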

4 Related concepts

5 To do

Writing a program in Haskell -- or in combinatory logic :-) -- which could help in making conjectures on combinatory logic-based Chaitin's constructions. It would compute only approximations, similarly to how most Mandelbrot-plotting software works: it would ask for a maximum limit of iterations.

chaitin --computation=cl --coding=tromp --limit-of-iterations=5000 --digits=10 --decimal
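
A minimal sketch of the core of such a program, under strong assumptions: it reuses the CLGen module above, and it presumes that the CL module exports the norm of the previous chapter together with a step-limited test hnfWithin (an assumed name; the real predicate “has normal form” is undecidable). Terms exceeding the step limit are treated as diverging, so the result is only a heuristic lower estimate.

 module Chaitin where
 
 import CLGen (direct)
 import CL (CL, norm, hnfWithin)  -- hnfWithin :: Integer -> CL -> Bool (assumed)
 
 -- Sum 2^(-norm t) over the first n generated terms that reach a normal
 -- form within the given number of reduction steps; a lower estimate of
 -- the “probability of termination”.
 approx :: Int -> Integer -> Double
 approx n limit =
     sum [ 2 ** negate (fromIntegral (norm t))
         | t <- take n direct
         , hnfWithin limit t ]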