Debian library packaging? or, how to get a single deb to work for hugs/ghc/nhc?

Shae Matijs Erisson shae@ScannedInAvian.com
Wed, 29 Jan 2003 19:33:04 +0100


Isaac Jones <ijones@syntaxpolice.org> writes:

>> The whole idea with binary distribution is to compile things once and
>> let others download and install the binary and be done with it.
>> Compiling Haskell programs of moderate size on a not-very-recent
>> computer takes quite some time, especially if you want to optimize too,
>> and will mean a *very slow* installation procedure. That simply is
>> unacceptable. Byte-compiling, as e.g. Python does, is in my experience
>> a lot faster.

> But this does bring up something that I admit I hadn't really thought
> much about: in stable, the new compiler releases won't really be a
> problem, but the different compilers (ghc, nhc, hugs) would still be a
> problem.
>
> So am I hearing that others would prefer to have packages like:
> hunit-ghc5.04.2, hunit-ghc4, hunit-hugs, hunit-nhc, etc?

How long does it take to compile packages on various hardware?
Does anyone have numbers?
I'd prefer the Emacs/Python compile-on-install solution, since that would cut
down on the number of packages.
What about profiling for GHC? Each library would then also require a profiling
version, right? That list of hunit packages above would get even longer.

As you suggested in IRC, what about having source packages that can compile
themselves upon installation, and binary debs for the really big libs that
would take a long time to build?
I don't know if that would be difficult to do for a deb.
That would solve the problem of binary debs becoming incompatible with each
minor version change in GHC.
Does NHC suffer the same incompatibility?
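For the compile-on-install route, the build could be triggered from the deb's
postinst maintainer script. A minimal sketch, assuming a hypothetical package
that unpacks its sources under /usr/src/hunit (the paths, package name, and
install locations here are illustrative, not an existing deb):

```shell
#!/bin/sh
# Hypothetical postinst for a compile-on-install Haskell library deb.
# Paths and package name are illustrative only.
set -e

SRC=/usr/src/hunit
DEST=/usr/lib/haskell-packages/ghc/imports

case "$1" in
    configure)
        # Only build if the sources are present and a GHC is on the PATH;
        # compiling against the installed GHC keeps the interface files
        # in sync with the compiler's minor version.
        if [ -d "$SRC" ] && command -v ghc >/dev/null 2>&1; then
            mkdir -p "$DEST"
            ( cd "$SRC" && ghc --make -O -odir "$DEST" -hidir "$DEST" Test/HUnit.hs )
        fi
        ;;
esac

exit 0
```

A binary deb for a really big library could skip this hook and ship the
compiled objects directly; the tradeoff is exactly the install-time cost
discussed above.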

I'll come up with some numbers for compiling libs on an Athlon 800MHz I have
handy. Is that okay as an average machine?
-- 
Shae Matijs Erisson - 2 days older than RFC0226
#haskell on irc.freenode.net - We Put the Funk in Funktion