Re: [Haskell-cafe] Something like optimistic evaluation

Nicu Ionita nionita at lycos.de
Tue Apr 29 15:22:41 EDT 2008


I don't know if this would be worth it, but theoretically one could go
further and evaluate those thunks that:

a) would be evaluated anyway (after the current IO operation has
completed), and
b) do not depend on the result of the current operation.

And, of course, the GC could also run during this time.
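
A minimal, user-level sketch of (a) and (b), assuming the deepseq package
and a made-up helper name (this is not an existing library function, and
not the RTS-level mechanism discussed below):

  import Control.Concurrent (forkIO)
  import Control.DeepSeq (NFData, force)
  import Control.Exception (evaluate)
  import Control.Monad (void)

  -- Force a value that we know will be needed after the IO action and
  -- that does not depend on the action's result, while the action runs.
  speculateDuring :: NFData a => a -> IO b -> IO (a, b)
  speculateDuring thunk action = do
    void (forkIO (void (evaluate (force thunk))))  -- evaluate in parallel
    b <- action                                    -- the (possibly blocking) IO
    a <- evaluate (force thunk)                    -- cheap if the worker got there first
    return (a, b)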
 
Nicu

-----Original Message-----
From: haskell-cafe-bounces at haskell.org
[mailto:haskell-cafe-bounces at haskell.org] On Behalf Of Brent Yorgey
Sent: Tuesday, 29 April 2008 16:42
To: Daniil Elovkov
Cc: haskell-cafe at haskell.org
Subject: Re: [Haskell-cafe] Something like optimistic evaluation



On Mon, Apr 28, 2008 at 6:09 PM, Daniil Elovkov
<daniil.elovkov at googlemail.com> wrote:


Hello

Somewhat on the topic of optimistic evaluation, I've just thought of another
way to evaluate thunks.

When the program is about to block on some IO, what if we started a thread to
evaluate (any) unevaluated thunks? We would have an additional system thread,
but the blocked one would not actually be consuming any processor time.

This would only take place when the program is compiled with -threaded and run
with -N k, k > 1.

The RTS knows at least about some operations that will block, for example
those with which IO operations are implemented. It could simply start
evaluating any outstanding thunks (or do something more clever) right before
going into one of those operations, and stop when it returns.

Of course, it's not like optimistic evaluation because we don't avoid
creating thunks. But in a sense it's similar. It could also be compared with
incremental garbage collection :)

Has something like that been done or discussed?
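
For concreteness, here is a rough user-level approximation of the idea (the
proposal above is to do this inside the RTS, not in user code); the names are
made up for this sketch, it needs -threaded and +RTS -N2 or more, and it only
forces thunks to WHNF:

  {-# LANGUAGE ExistentialQuantification #-}

  import Control.Concurrent (forkIO, killThread)
  import Control.Exception (evaluate)
  import Control.Monad (forM_, void)

  -- A wrapper so a heterogeneous list of thunks can be carried around.
  data AnyThunk = forall a. AnyThunk a

  -- While a blocking operation runs, a helper thread works through the
  -- outstanding thunks; it is stopped as soon as the operation returns.
  evalWhileBlocked :: [AnyThunk] -> IO b -> IO b
  evalWhileBlocked thunks blockingOp = do
    worker <- forkIO $ forM_ thunks $ \(AnyThunk t) -> void (evaluate t)
    result <- blockingOp          -- e.g. a read from a socket
    killThread worker             -- stop speculating once the IO is done
    return result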



This sounds like it could be helpful in certain circumstances, but in many
cases it could probably lead to unpredictable (and uncontrollable!) memory
usage. I could imagine a situation where my program is running along just
fine, and then one day a read from the network takes a long time due to
latency or whatever, and suddenly memory usage shoots through the roof
because some infinite (or even just very large) data structure got evaluated.
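
A concrete (made-up) instance of that risk: if one of the outstanding thunks
happens to be an infinite structure, deep speculation never finishes and
allocates without bound.

  import Control.DeepSeq (force)
  import Control.Exception (evaluate)

  naturals :: [Integer]
  naturals = [0 ..]          -- harmless as long as it stays lazy

  main :: IO ()
  main = do
    _ <- evaluate (force naturals)  -- speculative deep evaluation: diverges,
                                    -- and heap usage grows without bound
    putStrLn "never reached"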

-Brent

