cvs commit: fptools/ghc/compiler/stranal DmdAnal.lhs
Mon, 6 Aug 2001 10:02:14 +0100
> Thu, 2 Aug 2001 09:31:13 -0700, Simon Marlow
> <firstname.lastname@example.org> writes:
> > Turn the strictness analyser back on again.
> Why does ghc still need huge amounts of memory to compile itself (after
> bootstrapping a few times)?
Using '+RTS -c -RTS' helps quite a bit until we can sort this out.
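For example, RTS options are passed to GHC itself on its command line,
so compiling a single module with the compacting collector enabled
looks something like this (Foo.hs is just a placeholder name):

    ghc -O -c Foo.hs +RTS -c -RTS

(The first '-c' is GHC's compile-to-object flag; the '-c' between +RTS
and -RTS is the RTS option that turns on compacting collection for the
oldest generation.)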
> BTW, options like -H16M where the number is smaller than 64 seem to
> be ignored. (Such options are used in various places, so I expect
> them to break when they become respected.)
> AFAIK there is no way to say: use up to 120MB of heap, but don't
> allocate such a big heap at once, only if needed. This strategy works
> automatically up to the hard-coded limit of 64MB, but can't be used
> above that. IMHO it would be desirable to be able to specify that.
It appears that no-one understands the GC options except me. Perhaps
they should be redesigned, but here's the current story:
-H<size> is the *minimum* heap size. The heap will be grown as
necessary, starting with the minimum, and up to the maximum heap size.
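For example, to suggest a 32M starting heap (the size is purely
illustrative, and Main.hs is just a placeholder name):

    ghc Main.hs +RTS -H32M -RTS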
-M<size> is the *maximum* heap size, by default 64M. The heap is not
allowed to grow beyond this size (but note that the calculation is
inaccurate in that it currently takes into account only the oldest
generation, so if you have more than 2 generations you might well end up
using more memory). The reason for having a maximum heap size is that
otherwise a program with a space leak might eat all your swap, and may
even cause other processes to be killed, depending on the OS's
behaviour when it runs out of memory.
So, to start with a small heap that is allowed to grow up to 120M, you
just need to say +RTS -M120M.
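If you also want to start with a larger initial heap than the default,
you can combine the two flags, e.g. (numbers purely illustrative):

    ghc Main.hs +RTS -H16M -M120M -RTS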