:'''''Correction in process. There is a substantial point that is still lacking; the formulae and the concepts are not correct without it.'''''

__TOC__

== Introduction ==

Are there any real numbers which are defined exactly, but cannot be computed?
This question leads us to [[exact real arithmetic]], [[algorithmic information theory]], and the foundations of [[mathematics]] and [[computer science]].

See the Wikipedia article on [http://en.wikipedia.org/wiki/Chaitin%27s_constant Chaitin's construction], referring to e.g.

== Basing it on combinatory logic ==

Some more direct relatedness to functional programming: we can base it on [[combinatory logic]] (instead of a Turing machine).

=== Coding ===

See the prefix coding system described in Binary Lambda Calculus and Combinatory Logic (page 20), written by John Tromp:

:<math>\widehat{\mathbf S} = 00</math>
:<math>\widehat{\mathbf K} = 01</math>
:<math>\widehat{\left(x\;y\right)} = 1\;\widehat{x}\;\widehat{y}</math>

of course, <math>x</math>, <math>y</math> are meta-variables, and also some other notations are changed slightly.
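To make the coding concrete, here is a small Haskell sketch of an encoder. It is a sketch only: the <hask>CL</hask> type and the name <hask>code</hask> are introduced here just for the illustrations on this page (the page's real modules live on the linked subpages), and <hask>False</hask>/<hask>True</hask> stand for the bits 0/1:

<haskell>
-- A toy CL type, used only by the sketches on this page.
data CL = S | K | App CL CL deriving Show

-- The prefix code above: S ~ 00, K ~ 01, application ~ 1 <x> <y>.
code :: CL -> [Bool]
code S         = [False, False]
code K         = [False, True]
code (App x y) = True : code x ++ code y
</haskell>

E.g. <hask>code (App S K)</hask> yields the bit sequence 1 00 01.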
=== Decoding ===

Having seen this, decoding is rather straightforward.
[[/Parser|Here is a parser]] for illustration, but it serves only didactic purposes: it will not be used in the final implementation, because a good term generator makes parsing superfluous for this task.
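A minimal decoder sketch in the same spirit, reusing the toy <hask>CL</hask> type above; it is named <hask>dc'</hask> here so as not to confuse it with the <hask>dc</hask> function discussed later:

<haskell>
-- Decode one term from the front of a bit sequence, returning the rest.
-- Nothing signals a syntactically incorrect (e.g. too short) sequence.
dc' :: [Bool] -> Maybe (CL, [Bool])
dc' (True : bits)      = do                -- 1: an application follows
  (x, rest)  <- dc' bits
  (y, rest') <- dc' rest
  Just (App x y, rest')
dc' (False : b : bits) =                   -- 0b: a base combinator
  Just (if b then K else S, bits)
dc' _                  = Nothing
</haskell>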

=== Chaitin's construction ===

Now, Chaitin's construction will be
:<math>\sum_{p\in \mathrm{Dom}_\mathrm{dc},\;\mathrm{hnf}\left(\mathrm{dc}\;p\right)} 2^{-\left|p\right|}</math>

where

;<math>\mathrm{hnf}</math>
:should denote a unary predicate “has normal form” (“terminates”)
;<math>\mathrm{dc}</math>
:should mean an operator “decode” (a function from finite bit sequences to [[combinatory logic]] terms)
;<math>2^{*}</math>
:should denote the set of all finite bit sequences
;<math>\mathrm{Dom}_\mathrm{dc}</math>
:should denote the set of syntactically correct bit sequences (semantically, they may either terminate or diverge), i.e. the domain of the decoding function, i.e. the range of the coding function. Thus, <math>\left\{00, 01, 1\;00\;00, 1\;00\;01, 1\;01\;00, 1\;01\;01, \dots\right\} = \mathrm{Dom}_{\mathrm{dc}} = \mathrm{Rng}_{\widehat\ }</math>
;“Absolute value”
:should mean the length of a bit sequence (not [[combinatory logic]] term evaluation!)
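For example, the two shortest code words <math>00</math> and <math>01</math> decode to <math>\mathbf S</math> and <math>\mathbf K</math>, which are already in normal form, so they alone contribute <math>2^{-2}+2^{-2}=\frac12</math> to the sum; the table below continues this bookkeeping.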

=== Table for small lengths ===

{| border="1" cellspacing="0" cellpadding="5" align="center"
! Length (<math>n</math>)
! All strings (<math>2^n</math>)
! Decodable strings, ratio, their sum till now
! Terminating, ratio, their sum till now
! <math>\Omega</math> approximated till now: mantissa -- binary, length-fitting binary, decimal
|-
| 0
| 1
| 0, 0, 0
| 0, 0, 0
| -, -, -
|-
| 1
| 2
| 0, 0, 0
| 0, 0, 0
| -, 0, 0
|-
| 2
| 4
| 2, <math>\frac12</math>, <math>\frac12</math>
| 2, <math>\frac12</math>, <math>\frac12</math>
| 1, 10, 5
|-
| 3
| 8
| 0, 0, <math>\frac12</math>
| 0, 0, <math>\frac12</math>
| 1, 100, 5
|-
| 4
| 16
| 0, 0, <math>\frac12</math>
| 0, 0, <math>\frac12</math>
| 1, 1000, 5
|-
| 5
| 32
| 4, <math>\frac18</math>, <math>\frac58</math>
| 4, <math>\frac18</math>, <math>\frac58</math>
| 101, 10100, 625
|}

It illustrates nicely that Chaitin's construction is a [http://en.wikipedia.org/wiki/Normal_number normal number]: its digits (in binary representation) look as if they were generated by tossing a coin.

== Eliminating any concept of code by handling [[combinatory logic]] terms directly ==

Chaitin's construction can be grasped also as

:<math>\sum_{p\in \mathrm{CL},\;\mathrm{hnf}\;p} 2^{-\left|\mathrm{dc}^{-1}\;p\right|}</math>

We can avoid referring to any code notion if we factor out the function

:<math>\left|\cdot\right|\circ\mathrm{dc}^{-1}</math>

give it a separate name, e.g.

:<math>\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N</math>

and notice that it can be defined directly in terms of CL-terms (we need not use any decoding concept any longer):

:<math>\left\Vert\mathbf K\right\Vert = 2</math>
:<math>\left\Vert\mathbf S\right\Vert = 2</math>
:<math>\left\Vert\left(x\;y\right)\right\Vert = 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert</math>

Thus, we transfer (lift) the notion of “length” from bit sequences to [[combinatory logic]] terms in an appropriate way. Let us call it, e.g., the “norm” of the term.
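With the toy <hask>CL</hask> type of the earlier sketches, the norm is a direct transcription of these equations (a sketch; the name <hask>norm</hask> is introduced here):

<haskell>
-- The norm of a term: the length of its code word,
-- but computed without any reference to coding.
norm :: CL -> Integer
norm S         = 2
norm K         = 2
norm (App x y) = 1 + norm x + norm y
</haskell>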

Thus, Chaitin's construction is grasped also as

:<math>\sum_{p \in \mathrm{Dom}_{\mathrm{nf}}} 2^{-\left\Vert p\right\Vert}</math>

where

:<math>\mathrm{nf} : \mathrm{CL} \supset\!\to \mathrm{CL}</math>

is a partial function defined on CL terms; it assigns to each “terminating” term its normal form.

Thus, we have no notions of “bit sequence”, “code”, “coding”, “decoding” at all. But their ghosts still haunt us: the definition of the norm function looks rather strange without knowing that it was transferred from a concept of coding.

=== More natural norm functions (from CL terms) ===

Question: if we move away from the approaches referring to any code concept anyway, could we define the norm in other ways? E.g.

:<math>\left\Vert\cdot\right\Vert : \mathrm{CL}\to\mathbb N</math>
:<math>\left\Vert\mathbf K\right\Vert = 1</math>
:<math>\left\Vert\mathbf S\right\Vert = 1</math>
:<math>\left\Vert\left(x\;y\right)\right\Vert = 1 + \left\Vert x\right\Vert + \left\Vert y\right\Vert</math>

And is it worth doing at all? The former norm, at least, had a good theoretical foundation (based on analysis, arithmetic and probability theory): it is transferred from a prefix-free code, so Kraft's inequality guarantees that the sum of the <math>2^{-\left\Vert p\right\Vert}</math> terms stays at most 1. The latter norm is not so much cleaner that we should prefer it while lacking such grounds: a CL-term with <math>n</math> leaves now has norm only <math>2n-1</math>, while the number of such terms grows roughly like <math>8^n</math>, so the sum of <math>2^{-\left\Vert p\right\Vert}</math> over ''all'' CL-terms diverges, and the probability interpretation is no longer guaranteed.

What I really want is to exclude conceptually the notion of coding, and with it the distinction between “syntactically incorrect” and “syntactically correct but diverging”: taking into account only syntactically correct things, the only remaining choice is that between terminating and non-terminating terms, and only this choice should matter when calculating Chaitin's construction.

What I want to preserve:
* it can be interpreted as a probability;
* it is a [http://en.wikipedia.org/wiki/Normal_number normal number], as if its digits (in binary representation) were generated by tossing a coin;

thus I do not want to spoil these features.

==== Table for simpler CL-terms ====

Let us not take coding into account, thus excluding the notion of “syntactically incorrect coding” even ''conceptually''. Can we guess a good norm?
{| border="1" cellspacing="0" cellpadding="5" align="center"
! Binary tree pattern
! Maximal depth, vertices, edges
! Leafs, branches
! So many CL-terms = how to count it
! Terminating, ratio
! So many till now, ratio till now
|-
| <math>\cdot</math>
| 0, 1, 0
| 1, 0
| <math>2 = 2</math>
| 2, 1
| 2, 1
|-
| <math>\left(\right)</math>
| 1, 3, 2
| 2, 1
| <math>4 = 2\cdot2</math>
| 4, 1
| 6, 1
|-
| <math>\cdot\left(\right)</math>
| 2, 5, 4
| 3, 2
| <math>8 = 2\cdot2^2</math>
| 8, 1
| 14, 1
|-
| <math>\left(\right)\cdot</math>
| 2, 5, 4
| 3, 2
| <math>8 = 2^2\cdot2</math>
| 8, 1
| 22, 1
|-
| <math>\left(\right)\left(\right)</math>
| 2, 7, 6
| 4, 3
| <math>16 = 2^2\cdot2^2</math>
| 16, 1
| 38, 1
|}

== Implementation ==

To do:

Writing a program in Haskell -- or in [[combinatory logic]] :-) -- which could help in making conjectures on [[combinatory logic]]-based Chaitin's constructions. It would make only approximations, in a similar way to how most Mandelbrot-plotting software works. The analogy:
* they ask for a maximum limit of iterations, so that they can make a conjecture on the convergence of a series;
* this program will ask for a maximum limit of reduction steps, so that it can make a conjecture on the termination (having a normal form) of a CL-term.

Explanation: the program cannot prove the non-termination of any examined CL-term, but a good conjecture can be made: if termination does not take place within the given limit of reduction steps, then the examined CL-term is regarded as non-terminating.
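Here is a sketch of such a bounded termination test, on the toy <hask>CL</hask> type of the earlier sketches. The choice of leftmost-outermost reduction and the names <hask>step</hask>, <hask>hasNf</hask> are assumptions of this sketch, not something fixed by this page:

<haskell>
-- One leftmost-outermost reduction step; Nothing means: already in normal form.
step :: CL -> Maybe CL
step (App (App K x) _)         = Just x                          -- K x y   ~> x
step (App (App (App S f) g) x) = Just (App (App f x) (App g x))  -- S f g x ~> f x (g x)
step (App f x)                 = case step f of
  Just f' -> Just (App f' x)
  Nothing -> App f <$> step x
step _                         = Nothing

-- Does the term reach a normal form within the given number of steps?
-- (A False answer is only a conjecture of non-termination, as explained above.)
hasNf :: Int -> CL -> Bool
hasNf limit t = case step t of
  Nothing -> True
  Just t' -> limit > 0 && hasNf (limit - 1) t'
</haskell>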

=== Architecture ===

A CL-term generator generates CL-terms in “ascending order” (ascending by a theoretically appropriate “norm”), and by computing the norm of each CL-term and testing its termination, the program approximates Chaitin's construction (at a given number of digits, and according to the given maximal limit of reduction steps).
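Putting the earlier sketches together, the approximation loop itself might look like this; <hask>direct</hask> is meant to be an enumeration of all CL-terms like the one in the term generator below, and cutting off after a fixed <hask>count</hask> of terms is a simplification of this sketch (a real implementation would rather enumerate all terms up to a norm bound):

<haskell>
-- Approximate Chaitin's construction: sum 2^(-norm t) over the first
-- 'count' generated terms that reach a normal form within 'steps' steps.
approxOmega :: Int -> Int -> Double
approxOmega steps count =
  sum [ 2 ** negate (fromIntegral (norm t))
      | t <- take count direct
      , hasNf steps t ]
</haskell>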

=== User interface ===

 chaitin --model-of-computation=cl --encoding=tromp --limit-of-reduction-steps=500 --digits=9 --decimal
 chaitin --model-of-computation=cl --encoding=direct --limit-of-reduction-steps=500 --digits=9 --decimal

=== Term generator ===

<haskell>
module CLGen where

import Generator (gen0)
import CL (CL, k, s, apply)

-- All CL-terms, enumerated directly (without any encoding),
-- in ascending order of the number of applications.
direct :: [CL]
direct = gen0 apply [s, k]
</haskell>

See [[/Combinatory logic|combinatory logic term modules here]].

<haskell>
module Generator (gen0) where

import PreludeExt (cross)

-- All terms buildable from the base elements 'c' with the binary
-- constructor 'f', enumerated by ascending size (number of 'f'-nodes).
gen0 :: (a -> a -> a) -> [a] -> [a]
gen0 f c = gen f c 0

gen :: (a -> a -> a) -> [a] -> Integer -> [a]
gen f c n = sizedGen f c n ++ gen f c (succ n)

-- Terms with exactly n applications of 'f'.
sizedGen :: (a -> a -> a) -> [a] -> Integer -> [a]
sizedGen _ c 0 = c
sizedGen f c n = map (uncurry f)
                 $ concat [sizedGen f c i `cross` sizedGen f c (n - 1 - i) | i <- [0 .. n - 1]]
</haskell>
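A quick illustration of how the generator enumerates terms, applied to strings instead of CL-terms (a hypothetical GHCi session):

<haskell>
-- ghci> take 6 (gen0 (\x y -> "(" ++ x ++ " " ++ y ++ ")") ["S", "K"])
-- ["S","K","(S S)","(S K)","(K S)","(K K)"]
</haskell>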
 
<haskell>
module PreludeExt (cross) where

-- Cartesian product of two lists.
cross :: [a] -> [a] -> [(a, a)]
cross xs ys = [(x, y) | x <- xs, y <- ys]
</haskell>
 
=== Approach based on decoding with total function ===

Seen above, <math>\mathrm{dc}</math> was a partial function (from finite bit sequences). We can implement it e.g. as

<haskell>
dc :: [Bit] -> CL
dc = fst . head . runParser clP
</haskell>

where the use of <hask>head</hask> reveals that it is a partial function (of course, because not every bit sequence is a correct coding of a CL-term).

If this is confusing or annoying, then we can choose a more Haskell-like approach, making <math>\mathrm{dc}</math> a total function:

<haskell>
dc :: [Bit] -> Maybe CL
dc = fst . head . runParser (neverfailing clP)
</haskell>

where

<haskell>
neverfailing :: MonadPlus m => m a -> m (Maybe a)
neverfailing p = liftM Just p `mplus` return Nothing
</haskell>

then Chaitin's construction will be

:<math>\sum_{p\in 2^*,\;\mathrm{maybe}\;\downarrow\;\mathrm{hnf}\;\left(\mathrm{dc}\;p\right)} 2^{-\left|p\right|}</math>

where <math>\downarrow</math> should denote the false truth value.
== Related concepts ==

== To do ==

* Making tasks described in [[#Implementation]]
* Making more natural norm functions (from CL-terms), see [[#More natural norm functions (from CL terms)]]


[[Category:Pages under construction]]
