# [Haskell-cafe] Re: Web server continued

Jonathan Cast jonathanccast at fastmail.fm
Mon Dec 31 14:41:42 EST 2007

```
On 31 Dec 2007, at 1:33 PM, Achim Schneider wrote:

> Jonathan Cast <jonathanccast at fastmail.fm> wrote:
>
>>
>> On 31 Dec 2007, at 10:43 AM, Achim Schneider wrote:
>>
>>> Achim Schneider <barsoap at web.de> wrote:
>>>
>>>> That's not specified, though; the runtime could choose to let +
>>>> force the two chunks the other way round.
>>>>
>>> And that is probably also the reason why [1..] == [1..] is _|_.
>>>
>>> Is "something that can, under any evaluation strategy, be bottom,
>>> is bottom" quite right, i.e. is the formalism defined such that no
>>> possibly unevaluable thing is defined?
>>
>> No.  Again, the semantics of Haskell are defined denotationally, not
>> operationally.  In fact, Haskell implementations are required to use
>> an evaluation strategy that finds a value whenever one
>> (denotationally) exists, so it's the exact opposite of what you said.
>>
>> Strict languages come much closer to your rule, OTOH.
>>
> I guess I just have to change "unevaluable" to "not denotationally
> reducible".

Well, defined as _|_.  Reduction isn't really a very good model for
denotational semantics at all (it's an operational concept).

Think of a recursively defined value as a limit:

let x = f x in x
= lim(_|_, f _|_, f (f _|_), ...)
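-- This chain of approximations is observable in GHC.  A sketch,
-- assuming f = (1 :) (a hypothetical choice), whose least fixed
-- point is the infinite list of ones:

fixf :: (a -> a) -> a
fixf f = let x = f x in x

ones :: [Int]
ones = fixf (1 :)

-- The nth approximation, f applied n times to _|_, has n defined
-- elements; anything deeper is still _|_, but its defined prefix
-- already agrees with the limit:
approx :: Int -> [Int]
approx n = iterate (1 :) undefined !! n

main :: IO ()
main = do
  print (take 3 (approx 3))   -- [1,1,1]
  print (take 3 ones)         -- [1,1,1]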

and then other functions pass through that limit

g (let x = f x in x)
= g (lim(_|_, f _|_, f (f _|_), ...))
= lim(g _|_, g (f _|_), g (f (f _|_)), ...)
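-- Continuity in code: g applied to the nth approximation agrees
-- with g of the limit on everything that approximation defines.
-- A sketch assuming f = (1 :) and g = map (+ 1), both hypothetical
-- choices:

g :: [Int] -> [Int]
g = map (+ 1)

main :: IO ()
main = do
  let approx n = iterate (1 :) undefined !! n  -- f^n _|_
      limit    = let x = 1 : x in x            -- the fixed point
  print (take 4 (g (approx 4)))   -- [2,2,2,2]
  print (take 4 (g limit))        -- [2,2,2,2]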

In that sense, a value is _|_ iff it cannot be proven to be anything
more defined: iff every increasing sequence it is the LUB (limit) of
is everywhere _|_.  But again, don't think operationally; think in
terms of LUBs of increasing sequences.
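-- The same picture explains why [1..] == [1..] is _|_: every finite
-- approximation of either list leaves the comparison undecided, so
-- the limit of the answers is still _|_.  Observably, the comparison
-- never returns.  A sketch; the 0.1 s timeout is an arbitrary choice:

import Control.Exception (evaluate)
import System.Timeout (timeout)

main :: IO ()
main = do
  r <- timeout 100000 (evaluate ([1..] == [1..] :: Bool))
  print r   -- Nothing: no finite amount of work yields an answer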

jcc

```