Bang patterns, ~ patterns, and lazy let

John Hughes rjmh at
Tue Feb 7 16:44:24 EST 2006

>From: Ben Rudiak-Gould <Benjamin.Rudiak-Gould at>
>Subject: Re: Bang patterns, ~ patterns, and lazy let
>It's also not the case that !x has the same 
>meaning in both proposals, e.g.
>     let { !x = y ; !y = const 'a' x } in x
>means 'a' in the current proposal but _|_ in yours.
Aargh, you're right, it does mean _|_ in mine! That's not very nice.
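For reference, the seq-based reading of the current proposal can be written out by hand and run directly (a sketch; the desugaring and the name are mine, and no BangPatterns extension is needed):

```haskell
-- Hand desugaring of:  let { !x = y ; !y = const 'a' x } in x
-- on the reading where the group is bound lazily and each banged
-- variable is forced before the body.
currentProposal :: Char
currentProposal =
  let x = y
      y = const 'a' x
  in x `seq` y `seq` x    -- forcing x demands y, which is 'a'
```

Forcing x evaluates y = const 'a' x, which is 'a' without ever needing x's value, so the whole expression is 'a'.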

But wait, I'm not sure about the claim that

    let { !x = const undefined y ; !y = const 'a' x } in y

    desugars in the current proposal to

    let { x = const undefined y ; y = const 'a' x } in x `seq` y `seq` y

    which is _|_, but absent implicit ~,

    let { x = const undefined y ; y = const 'a' x } in y

    had better (and does) mean 'a'. 
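Both readings of this second example can likewise be run directly, with the bangs written out as seqs (a sketch under that desugaring; the names are mine):

```haskell
-- Hand desugaring of:  let { !x = const undefined y ; !y = const 'a' x } in y
-- with the bangs as seqs: forcing x evaluates const undefined y = undefined.
strictReading :: Char
strictReading =
  let x = const undefined y
      y = const 'a' x
  in x `seq` y `seq` y    -- _|_

-- The same group with the bangs dropped: y never demands x's *value*.
lazyReading :: Char
lazyReading =
  let x = const undefined y
      y = const 'a' x
  in y                    -- 'a'
```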

Applying the rules on the wiki, the first step is to translate the first 
expression into a tuple binding, omitting the implicit ~:

    let (x,y) = (const undefined y, const 'a' x) in y

This desugars to

    let (x,y) = fix (\ ~(x,y)->(const undefined y, const 'a' x)) in y

which evaluates to 'a'. In other words, despite the ! on x, the current 
proposal is not strict in x.
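That translation can be checked directly with Data.Function.fix (a sketch; the inner names are primed to avoid shadowing):

```haskell
import Data.Function (fix)

-- The wiki translation with the implicit ~ (and the bangs lost):
tupled :: Char
tupled =
  let (x, y) = fix (\ ~(x', y') -> (const undefined y', const 'a' x'))
  in y   -- selects the second component; the first is never forced
```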

Maybe the intention was that !s on the lhs of let bindings should be 
transferred to the corresponding patterns when a tuple pattern is 
introduced? Let's try it: then the example desugars by pattern tupling to

    let (!x, !y) = (const undefined y, const 'a' x) in y

Now we can introduce fix:

    let (!x, !y) = fix (\ ~(!x, !y) -> (const undefined y, const 'a' x)) in y

and finally case:

    case fix (\ ~(!x, !y) -> (const undefined y, const 'a' x)) of ~(!x, !y) -> y

and this is consistent with what you said above.
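The final form can be modelled without the BangPatterns extension by making the delayed match force both components when it is first demanded (a sketch of that semantics; forceBoth is a hypothetical helper of mine):

```haskell
import Data.Function (fix)

-- A demanded match against (!x, !y) evaluates the pair and both fields:
forceBoth :: (a, b) -> (a, b)
forceBoth p = case p of (a, b) -> a `seq` b `seq` (a, b)

-- Model of:
--   case fix (\ ~(!x, !y) -> (const undefined y, const 'a' x)) of ~(!x, !y) -> y
banged :: Char
banged =
  let r = fix (\p -> let (x, y) = forceBoth p
                     in (const undefined y, const 'a' x))
      (x, y) = forceBoth r
  in y    -- demanding y performs the match, which forces
          -- fst r = undefined, so the result is _|_
```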

But if I return to your first example, and do the same thing, I get

    let !x = y; !y = const 'a' x in x

desugars by tupling to

    let (!x, !y) = (y, const 'a' x) in x

which desugars by fix and case introduction to

    case fix (\ ~(!x, !y) -> (y, const 'a' x)) of ~(!x, !y) -> x

The first approximation to the fixed point is _|_, so the second is 
(_|_, 'a'). Now, when ~(!x,!y) is matched against (_|_,'a') then *both* 
variables are bound to _|_ --- the effect of the ~ is just to delay 
matching (!x,!y) until one of the variables is used, but as soon as y, 
say, *is* used, then the match is performed and, of course, it loops. 
Thus (_|_, 'a') is the fixed point. For the same reason, x and y are 
both bound to _|_ in the body of the case, and so the entire expression 
evaluates to _|_, not 'a' as you claimed.
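The crucial step here, that the first use of either variable performs the whole (!x, !y) match, is visible even with a non-recursive scrutinee; modelling the forced match by hand (a sketch):

```haskell
-- Model of:  case (undefined, 'a') of ~(!x, !y) -> y
-- The ~ delays the match, but using y performs it, and the bangs
-- then force *both* components, so even y is _|_ here.
useY :: Char
useY =
  let p :: (Char, Char)
      p = (undefined, 'a')
      t = case p of (a, b) -> a `seq` b `seq` (a, b)
  in snd t
```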

Bottom line: I can't find a way to interpret the translation rules in 
the Haskell report, modified as the Wiki page suggests, to produce the 
results you expect in both cases.

But maybe the fault is in the translation rules in the Haskell report. 
It was always rather tricky to explain a group of recursive bindings in 
Haskell in terms of a single tuple binding, because Haskell tuples are 
lifted. I see that you have a more direct understanding of what ! is 
supposed to mean. Is it possible, I wonder, to give a direct 
denotational semantics to a declaration group--say mapping environments 
to environments--in which there is only one case for ! (its natural 
semantics in patterns)? Such a semantics should have the property that

    let x1 = e1; x2 = e2 in e0   ===   let x1 = e1 in let x2 = e2 in e0

provided x2 does not occur in e1. Finding a simple and compositional 
denotational semantics with these properties, and proving the law above, 
would be a good way to show that ! patterns do NOT introduce semantic 
warts---and would probably also suggest that the 
semantics-by-translation used in the report is fundamentally flawed. We 
did construct a denotational semantics for fragments of Haskell as part 
of the original design, and it had quite an impact on the result--I 
recommend the exercise as a way of debugging ideas!
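One cheap sanity check, not a proof: on a concrete instance where neither binding mentions the other (so the side condition holds trivially), the grouped and nested forms agree even when the bangs are modelled by explicit seqs (the example is mine):

```haskell
-- Instance of:  let x1 = e1; x2 = e2 in e0  ===  let x1 = e1 in let x2 = e2 in e0
-- with the banged bindings modelled as seqs before the body.
groupForm, nestedForm :: Int
groupForm  = let x1 = 10; x2 = 32 in x1 `seq` x2 `seq` (x1 + x2)
nestedForm = let x1 = 10 in x1 `seq` (let x2 = 32 in x2 `seq` (x1 + x2))
```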

More information about the Haskell-prime mailing list