Chaitin's construction
From HaskellWiki
Latest revision as of 21:34, 14 March 2009
:'''''Correction in progress. A substantial point is still missing; the formulae and the concepts are not correct without it.'''''
__TOC__
== Introduction ==
Are there any real numbers which are defined exactly, but cannot be computed? This question leads us to [[exact real arithmetic]], [[algorithmic information theory]], and the foundations of [[mathematics]] and [[computer science]].
See the Wikipedia article on [http://en.wikipedia.org/wiki/Chaitin%27s_constant Chaitin's construction], referring to e.g.
* Computing a Glimpse of Randomness (written by Cristian S. Calude, Michael J. Dinneen, and Chi-Kou Shu)
* Omega and why math has no TOEs (Gregory Chaitin).
== Basing it on combinatory logic ==
Some more direct relatedness to functional programming: we can base Ω on [[combinatory logic]] (instead of a Turing machine).
=== Coding ===
See the prefix coding system described in Binary Lambda Calculus and Combinatory Logic (page 20), written by John Tromp; of course, ''c'', ''d'' are metavariables here, and some other notations are also changed slightly.
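For a quick illustration, the prefix code just described (K encoded as 00, S as 01, and an application as 1 followed by the codes of the two subterms) can be sketched in Haskell; the <code>CL</code> type here is only a stand-in for the article's term modules:

```haskell
-- A minimal CL term type, only for this illustration.
data CL = K | S | App CL CL

-- Tromp's prefix code: K is "00", S is "01",
-- an application (x y) is '1' followed by the codes of x and y.
encode :: CL -> String
encode K         = "00"
encode S         = "01"
encode (App x y) = '1' : encode x ++ encode y
```

For example, <code>encode (App K S)</code> yields the five-bit string <code>"10001"</code>.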
=== Decoding ===
Having seen this, decoding is rather straightforward. [[/Parser|Here is a parser]] for illustration, but it serves only didactic purposes: it will not be used in the final implementation, because a good term generator makes parsing superfluous for this task.
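For a self-contained flavour of the decoding, independently of the subpage's parser combinators, a direct prefix decoder can be sketched (the <code>CL</code> type is again illustrative):

```haskell
data CL = K | S | App CL CL deriving (Eq, Show)

-- Decode one term from the front of a bit string;
-- Nothing on a syntactically incorrect prefix.
decode :: String -> Maybe (CL, String)
decode ('0':'0':rest) = Just (K, rest)
decode ('0':'1':rest) = Just (S, rest)
decode ('1':rest)     = do
  (x, rest')  <- decode rest
  (y, rest'') <- decode rest'
  Just (App x y, rest'')
decode _              = Nothing
```

Here <code>decode "10001"</code> recovers <code>App K S</code> with no remainder, while <code>decode "1"</code> fails: a syntactically incorrect string.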
=== Chaitin's construction ===
Now, Chaitin's construction will be
:<math>\sum_{p\in \mathrm{Dom}_\mathrm{dc},\;\mathrm{hnf}\left(\mathrm{dc}\;p\right)} 2^{-\left|p\right|}</math>
where
;<math>\mathrm{hnf}</math>
:should denote a unary predicate “has normal form” (“terminates”)
;<math>\mathrm{dc}</math>
:should mean an operator “decode” (a function from finite bit sequences to [[combinatory logic]] terms)
;<math>2^{*}</math>
:should denote the set of all finite bit sequences
;<math>\mathrm{Dom}_\mathrm{dc}</math>
:should denote the set of syntactically correct bit sequences (semantically, they may either terminate or diverge), i.e. the domain of the decoding function, i.e. the range of the coding function. Thus, <math>\left\{00, 01, 1\;00\;00, 1\;00\;01, 1\;01\;00, 1\;01\;01, \dots\right\} = \mathrm{Dom}_{\mathrm{dc}} = \mathrm{Rng}_{\widehat{\ }}</math>
;“Absolute value”
:should mean the length of a bit sequence (not [[combinatory logic]] term evaluation!)
=== Table for small lengths ===
{| border="1" cellspacing="0" cellpadding="5" align="center"
! Length (<math>n</math>)
! All strings (<math>2^n</math>)
! Decodable strings, ratio, their sum till now
! Terminating, ratio, their sum till now
! <math>\Omega</math> approximated till now: mantissa as binary, length-fitting binary, decimal
|-
| 0
| 1
| 0, 0, 0
| 0, 0, 0
| -, -, -
|-
| 1
| 2
| 0, 0, 0
| 0, 0, 0
| -, 0, 0
|-
| 2
| 4
| 2, <math>\frac12</math>, <math>\frac12</math>
| 2, <math>\frac12</math>, <math>\frac12</math>
| 1, 10, 5
|-
| 3
| 8
| 0, 0, <math>\frac12</math>
| 0, 0, <math>\frac12</math>
| 1, 100, 5
|-
| 4
| 16
| 0, 0, <math>\frac12</math>
| 0, 0, <math>\frac12</math>
| 1, 1000, 5
|-
| 5
| 32
| 4, <math>\frac18</math>, <math>\frac58</math>
| 4, <math>\frac18</math>, <math>\frac58</math>
| 101, 10100, 625
|}
It illustrates nicely that Chaitin's construction is a [http://en.wikipedia.org/wiki/Normal_number normal number], as if its digits (in binary representation) were generated by tossing a coin.
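The arithmetic of the last columns can be replayed with a short sketch (the counts are read off the table; the function name is ad hoc):

```haskell
-- Partial sum of Chaitin's construction, given pairs of
-- (code length n, number of terminating decodable strings of that length):
-- each such string contributes 2^(-n).
omegaApprox :: [(Int, Int)] -> Double
omegaApprox counts = sum [fromIntegral c * 2 ^^ negate n | (n, c) <- counts]
```

For example, <code>omegaApprox [(2, 2), (5, 4)]</code> gives 0.625, the decimal value in the table's last row.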
== Eliminating any concept of code by handling [[combinatory logic]] terms directly ==
Chaitin's construction can also be grasped as
:<math>\sum_{p\in \mathrm{CL},\;\mathrm{hnf}\;p} 2^{-\left|\mathrm{dc}^{-1}\;p\right|}</math>
We can avoid referring to any code notion if we modularize out the function
:<math>\left|\cdot\right|\circ\mathrm{dc}^{-1}</math>
and give it a separate name, e.g.
:<math>\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N</math>
and notice that it can be defined directly in terms of CL-terms (we need not use any decoding concept any longer):
:<math>\left\Vert\mathbf K\right\Vert = 2</math>
:<math>\left\Vert\mathbf S\right\Vert = 2</math>
:<math>\left\Vert\left(x\;y\right)\right\Vert = 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert</math>
Thus, we transfer (lift) the notion of “length” from bit sequences to [[combinatory logic]] terms in an appropriate way. Let us call it, e.g., the “norm” of the term.
Thus, Chaitin's construction is also grasped as
:<math>\sum_{p \in \mathrm{Dom}_{\mathrm{nf}}} 2^{-\left\Vert p\right\Vert}</math>
where
:<math>\mathrm{nf} : \mathrm{CL} \supset\!\to \mathrm{CL}</math>
is a partial function defined on CL-terms; it attributes to each “terminating” term its normal form.
Thus, we have no notions of “bit sequence”, “code”, “coding”, “decoding” at all. But their ghosts still haunt us: the definition of the norm function looks rather strange without keeping in mind that it was transferred from a concept of coding.
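The norm can be transcribed directly into Haskell (with an illustrative stand-in <code>CL</code> type, not the article's final modules):

```haskell
data CL = K | S | App CL CL

-- The norm transferred from Tromp's code:
-- base combinators cost 2 bits, an application node costs 1 bit.
norm :: CL -> Integer
norm K         = 2
norm S         = 2
norm (App x y) = 1 + norm x + norm y
```

For example, <code>norm (App (App S K) K)</code> is 8, the length of the 8-bit code 1 1 01 00 00 of the term S K K.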
=== More natural norm functions (from CL terms) ===
Question: if we already move away from the approaches referring to any code concept, then could we define the norm in other ways? E.g.
:<math>\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N</math>
:<math>\left\Vert\mathbf K\right\Vert = 1</math>
:<math>\left\Vert\mathbf S\right\Vert = 1</math>
:<math>\left\Vert\left(x\;y\right)\right\Vert = 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert</math>
And is it worth doing at all? The former norm, at least, had a good theoretical foundation (based on analysis, arithmetic and probability theory). This latter one is not so much cleaner that we should prefer it despite its lacking theoretical grounds.
What I really want is to exclude the notion of coding conceptually, and with it the distinction between “syntactically incorrect” and “syntactically correct but diverging”. Thus we take into account only syntactically correct things, and see only the choice between terminating and non-terminating; only termination versus non-termination is taken into account when calculating Chaitin's construction.
What I want to preserve:
* it can be interpreted as a probability;
* it is a [http://en.wikipedia.org/wiki/Normal_number normal number], as if its digits (in binary representation) were generated by tossing a coin;
thus I do not want to spoil these features.
==== Table for simpler CL-terms ====
Let us not take coding into account, thus excluding the notion of “syntactically incorrect coding” even ''conceptually''. Can we guess a good norm?
{| border="1" cellspacing="0" cellpadding="5" align="center"
! Binary tree pattern
! Maximal depth, vertices, edges
! Leafs, branches
! So many CL-terms = how to count it
! Terminating, ratio
! So many till now, ratio till now
|-
| <math>\cdot</math>
| 0, 1, 0
| 1, 0
| <math>2 = 2</math>
| 2, 1
| 2, 1
|-
| <math>\left(\right)</math>
| 1, 3, 2
| 2, 1
| <math>4 = 2\cdot2</math>
| 4, 1
| 6, 1
|-
| <math>\cdot\left(\right)</math>
| 2, 5, 4
| 3, 2
| <math>8 = 2\cdot2^2</math>
| 8, 1
| 14, 1
|-
| <math>\left(\right)\cdot</math>
| 2, 5, 4
| 3, 2
| <math>8 = 2^2\cdot2</math>
| 8, 1
| 22, 1
|-
| <math>\left(\right)\left(\right)</math>
| 2, 7, 6
| 4, 3
| <math>16 = 2^2\cdot2^2</math>
| 16, 1
| 38, 1
|}
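The “how to count it” column follows a Catalan-number pattern: there are <math>C_n</math> binary tree shapes with <math>n</math> application nodes, and each of the <math>n+1</math> leaves can be <math>\mathbf K</math> or <math>\mathbf S</math>. A quick sketch to check the per-size totals (names are ad hoc):

```haskell
-- Number of binary tree shapes with n internal (application) nodes.
catalan :: Integer -> Integer
catalan 0 = 1
catalan n = sum [catalan i * catalan (n - 1 - i) | i <- [0 .. n - 1]]

-- Number of CL terms built from K and S with n application nodes.
clCount :: Integer -> Integer
clCount n = catalan n * 2 ^ (n + 1)
```

Here <code>map clCount [0, 1, 2]</code> gives [2, 4, 16], matching 2, then 4, then 8 + 8 in the table.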
== Implementation ==
To do: writing a program in Haskell (or in [[combinatory logic]] :) ) which could help in making conjectures on [[combinatory logic]]-based Chaitin's constructions. It would make only approximations, in a similar way to how most Mandelbrot-plotting software works. The analogy:
* such software asks for a maximum limit of iterations, so that it can make a conjecture on the convergence of a series;
* this program will ask for a maximum limit of reduction steps, so that it can make a conjecture on the termination (having a normal form) of a CL-term.
Explanation: the program cannot prove the non-termination of any actually examined CL-term, but a good conjecture can be made: if termination does not take place within the given limit of reduction steps, then the examined CL-term is regarded as non-terminating.
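This conjecture-making can be sketched as a step-bounded normalization test; the leftmost-outermost strategy and the names here are one possible choice, not a fixed design decision:

```haskell
data CL = K | S | App CL CL

-- One leftmost-outermost reduction step, if any redex exists.
step :: CL -> Maybe CL
step (App (App K x) _)         = Just x
step (App (App (App S f) g) x) = Just (App (App f x) (App g x))
step (App f x)                 = case step f of
  Just f' -> Just (App f' x)
  Nothing -> fmap (App f) (step x)
step _                         = Nothing

-- Conjecture: does the term reach a normal form within n steps?
-- If not, it is regarded (possibly wrongly) as non-terminating.
conjectureNF :: Int -> CL -> Bool
conjectureNF n t = case step t of
  Nothing -> True
  Just t' -> n > 0 && conjectureNF (n - 1) t'
```

For example, S K K K reduces to K in a few steps, so <code>conjectureNF 100</code> accepts it, while the classic self-applying term built from S K K keeps reducing forever and exhausts any step budget.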
=== Architecture ===
A CL-term generator generates CL-terms in “ascending order” (with respect to a theoretically appropriate “norm”), and by computing the norm of each CL-term, it approximates Chaitin's construction (to a given number of digits, and according to the given maximal limit of reduction steps).
=== User interface ===
 chaitin model-of-computation=cl encoding=tromp limit-of-reduction-steps=500 digits=9 decimal
 chaitin model-of-computation=cl encoding=direct limit-of-reduction-steps=500 digits=9 decimal
=== Term generator ===
<haskell>
module CLGen where

import Generator (gen0)
import CL (CL, k, s, apply)

-- | All CL terms, in ascending order of size.
direct :: [CL]
direct = gen0 apply [s, k]
</haskell>
See [[/Combinatory logic|combinatory logic term modules here]].
<haskell>
module Generator (gen0) where

import PreludeExt (cross)

gen0 :: (a -> a -> a) -> [a] -> [a]
gen0 f c = gen f c 0

gen :: (a -> a -> a) -> [a] -> Integer -> [a]
gen f c n = sizedGen f c n ++ gen f c (succ n)

sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
sizedGen _ c 0 = c
sizedGen f c n = map (uncurry f)
        $ concat [sizedGen f c i `cross` sizedGen f c (n - 1 - i) | i <- [0 .. n - 1]]
</haskell>
<haskell>
module PreludeExt (cross) where

cross :: [a] -> [a] -> [(a, a)]
cross xs ys = [(x, y) | x <- xs, y <- ys]
</haskell>
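A quick consistency check between the generator and the table of simpler CL-terms: the number of generated terms with <math>n</math> application nodes should run 2, 4, 16, … Here is a compact, self-contained copy of the sized generator, using <code>(+)</code> on numbers as a stand-in for <code>apply</code>:

```haskell
cross :: [a] -> [a] -> [(a, a)]
cross xs ys = [(x, y) | x <- xs, y <- ys]

-- Terms with exactly n application nodes, built from the given leaves.
sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
sizedGen _ c 0 = c
sizedGen f c n = map (uncurry f) $ concat
  [sizedGen f c i `cross` sizedGen f c (n - 1 - i) | i <- [0 .. n - 1]]
```

Then <code>map (length . sizedGen (+) [0, 1]) [0, 1, 2]</code> gives [2, 4, 16], matching the counts in the table above.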
== Related concepts ==
== To do ==
* Doing the tasks described in [[#Implementation]]
* Making more natural norm functions (from CL-terms), see [[#More natural norm functions (from CL terms)]]