[Haskell-cafe] Haskell maximum stack depth

Adrian Hey ahey at iee.org
Tue Jan 29 14:17:29 EST 2008


Jonathan Cast wrote:
> http://www.cs.princeton.edu/~appel/papers/45.ps is the traditional cite 
> here, no?

"Can be" is not the same as "is". A lot depends on exactly what you
call a "stack" and the relative efficiencies of stack vs. heap
implementations. Certainly my experience of library tuning tells
me that (with ghc at least), designing your code and data structures
to keep heap allocation down to an absolute minimum is very important.
So I'm very sceptical about claims that burning heap is just as
efficient. Heap allocation maybe just as cheap, but reclaiming costs
more.
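
To make this concrete, here's the sort of thing I mean (a toy example
of my own, not taken from any particular library):

  import Data.List (foldl')

  sumLazy, sumStrict :: [Int] -> Int
  sumLazy   = foldl  (+) 0  -- builds one heap thunk per element
  sumStrict = foldl' (+) 0  -- keeps one evaluated accumulator instead

Both are "correct", but with ghc the lazy version allocates (and the
GC must later reclaim) a chain of a million thunks for a million
element list, whereas the strict version allocates next to nothing.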

A lot also depends on the compiler (and its associated rts), such as
whether or not it translates to CPS, thereby in effect building a
"stack" (in all but name) on the heap.
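
For example (a hand-written sketch of the idea, not what any actual
compiler emits), a CPS transform turns pending stack frames into
heap-allocated closures:

  -- Direct style: one pending (+) per element on the call stack.
  sumDirect :: [Int] -> Int
  sumDirect []     = 0
  sumDirect (x:xs) = x + sumDirect xs

  -- CPS: every call is now a tail call; the pending additions live
  -- in a chain of closures on the heap instead of on the stack.
  sumCPS :: [Int] -> (Int -> Int) -> Int
  sumCPS []     k = k 0
  sumCPS (x:xs) k = sumCPS xs (\s -> k $! (x + s))

The "stack" hasn't gone away, it has just been renamed: the closure
chain grows by one cell per element, exactly as the stack would.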

So you could have exactly the same correct and *bug free* program
giving a stack "overflow" on ghc, but not on a CPS-based compiler
(the CPS implementation just uses a shed load of heap instead).
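
For instance (assuming a ghc of this vintage, with its default stack
limit), this perfectly correct little program dies:

  main :: IO ()
  main = print (foldr (+) 0 [1 .. 1000000 :: Integer])

  -- Aborts with something like:
  --   Stack space overflow: current size 8388608 bytes.
  --   Use `+RTS -Ksize' to increase it.

foldr here needs one pending (+) per list element, which ghc puts on
its bounded stack; an implementation that put those frames on the
heap would just chew memory and carry on.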

Other implementations (Chicken Scheme?) effectively build their
heap on the stack, which never shrinks until it "overflows". Is
that inherently buggy?

Surely the alleged bugginess of programs should not be dependent
on the choice of compiler/rts?

As nobody has provided any sensible justification for the assertion
that using lots of stack (vs. heap) is inherently bad (and that
programs which do so are buggy, by definition), it seems to me this
is one of those quasi-religious beliefs, like (so called) "global
variables" or cyclic module dependencies being bugs (by
definition).

Yes, using lots of stack is clearly bad with ghc, but that is a ghc
"bug". In fact the only reason these programs use lots of stack
(vs. heap) at all is a peculiarity of the ghc rts implementation, so
it really should be ghc that fixes the problem, or at least admits
responsibility :-)

Regards
--
Adrian Hey



