[Haskell-cafe] Monads in Scala, XSLT, Unix shell pipes was Re: Monads in ...

Bill Wood william.wood3 at comcast.net
Sun Nov 27 17:36:57 EST 2005


(I'm going to do a lazy permute on your stream of consciousness; hope it
terminates :-).

I think the Rubicon here is the step from one to many -- one
function/procedure to many, one thread to many, one processor to
many, and so on.  Our favorite pure functions are like the Hoare triples and
Dijkstra weakest preconditions of the formal methods folks in that the
latter abstract from the body of a procedure to the input-output
relation it computes; both the function and the abstracted procedure are
atomic and outside of time.  After all, aren't "referential
transparency" and "copy rule" all about replacing a function body with
its results?  Well, as soon as there are two or more
functions/procedures in the same environment, the prospect of
interaction and interference arises, and our nice, clean,
*comprehensible* atemporal semantics get replaced by temporal logic,
path expressions, trace specifications, what have you.  Some notion of
process is inevitable, since now each computation must be treated as an
activity over time in order to relate events that occur during the
execution of one computation with the events of another.
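
To make the copy-rule point concrete in Haskell (the names below are
mine, purely illustrative):

-- Referential transparency / the copy rule in miniature: a call can be
-- replaced by the body it names without changing the meaning.
square :: Int -> Int
square x = x * x

viaName, viaBody :: Int
viaName = square 3 + square 3
viaBody = (3 * 3) + (3 * 3)   -- body copied in for each call; same value

Nothing about viaName depends on *when* square runs, which is exactly
the atemporal comfort that evaporates once two computations can
interfere.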

Functional programming gives us the possibility of using algebra to
simplify the task of reasoning about single programs.  Of course,
non-functional procedures can also be reasoned about algebraically,
since a procedure P(args) that hammers on state can be adequately
described by a pure function f_P :: Args -> State -> State.  The problem
is, of course, that the state can be large.
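
For instance (a sketch with invented names), an imperative procedure
that bumps a counter and appends a log message is captured by such an
f_P:

-- The state the procedure "hammers on", made explicit.
data St = St { counter :: Int, msgs :: [String] }

-- The procedure bump(n), viewed as a pure function of type
-- Args -> State -> State.
bump :: Int -> St -> St
bump n (St c ms) = St (c + n) (("bumped by " ++ show n) : ms)

The catch, as noted, is that St here is tiny; a real program's state is
not.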

But the functional paradigm offers some hope for containing the
complexity in the world of many as it does in the world of one. I think
combining formalisms like Hoare's CSP or Milner's CCS with computations
gives us the possibility of doing algebra on the temporal event
sequences corresponding to their interactions; the hope is that this is
simpler than doing proofs in dynamic or temporal logic.  Using
functional programs simplifies the algebraic task by reducing the size
of the set of events over which the algebra operates -- you consider
only the explicitly shared parameters and results, not the implicitly
shared memory that can couple non-functional procedures.
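
To give a feel for the kind of algebra involved, here is a toy,
trace-flavoured encoding of my own devising (not Hoare's or Milner's
actual calculi):

type Event = String
type Trace = [Event]

-- All interleavings of two independent traces: the raw material the
-- algebra has to tame.
interleave :: Trace -> Trace -> [Trace]
interleave [] ys = [ys]
interleave xs [] = [xs]
interleave (x:xs) (y:ys) =
  map (x:) (interleave xs (y:ys)) ++ map (y:) (interleave (x:xs) ys)

-- With functional components, only the explicitly shared events matter,
-- so the alphabet we reason over stays small.
shared :: Trace -> Trace -> [Event]
shared xs ys = [ e | e <- xs, e `elem` ys ]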

It is conceivable that you can get your compositionality here as well.
Suppose we package computations with input-output parameter
specifications and CSP-like specifications of the pattern of event
sequences produced when the computation executes.  It may be possible to
reason about the interactions of the event sequences of groups of
packages, determine the event sequences over non-hidden events provided
by the composite system, etc.
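
A rough sketch of such a package, with invented names, just to fix
ideas:

type Event = String

-- A CSP-flavoured pattern of observable events (toy syntax, mine).
data EventPat
  = Emit Event
  | EventPat :>: EventPat    -- sequencing
  | EventPat :|: EventPat    -- choice
  | Hide [Event] EventPat    -- hide the internal events of a composite

-- A computation bundled with its input-output signature and its
-- event-sequence specification.
data Package i o = Package
  { run  :: i -> o
  , spec :: EventPat
  }

Composing two packages would then mean composing their run functions and
computing, from their specs, the event pattern the composite presents
over its non-hidden events.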

As far as Bulat's comment goes, I'm mostly in agreement.  My dataflow
view was really driven by the intuition that a functional program can be
described by a network of subfunctions linking outputs to inputs; cross
your eyes a little and voila!  A dataflow network.  And if we're smart
enough to make a compiler do that, why bother the programmer?  But
you're not talking about analyzing a function into a
parallel/concurrent/distributed implementation; rather, you're
interested in synthesizing a temporal process out of interacting
computations.
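
For what it's worth, here is the cross-your-eyes picture in toy Haskell
form (purely illustrative):

import Control.Arrow ((&&&), (>>>))

-- One small program, read as a network: each definition is a node,
-- each use of a result is a wire.
network :: Int -> Int
network x =
  let (a, b) = stage1 x      -- node with two output wires
      c      = stage2 a      -- node fed by one of them
  in  c + b                  -- join node
  where
    stage1 n = (n + 1, n * 2)
    stage2   = (* 10)

-- The same network written as boxes wired output-to-input.
network' :: Int -> Int
network' = ((+ 1) &&& (* 2)) >>> (\(a, b) -> a * 10 + b)

Reading the program either way is mechanical, which was the point of the
intuition: if a compiler can recover the network, the programmer need
not draw it.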

The temporal aspect won't go away.  And that's the problem.

 -- Bill Wood



