DPH libs are re-haddocked every time
duncan.coutts at googlemail.com
Thu Jan 7 05:52:38 EST 2010
On Thu, 2009-12-31 at 14:34 +0000, Simon Marlow wrote:
> If you have HADDOCK_DOCS enabled, then every 'make' will re-haddock the
> DPH libraries, which is rather annoying.
> I've done some investigation, here's what I've discovered:
> * When we run Haddock on DPH, Haddock must actually compile
> to object code, because they involve TH (actually annotations).
> * Haddock doesn't get to use the existing object code, because
> Cabal first copies every source file into a temporary location
> during the preprocessing phase.
> * The new .o files overwrite the old .o files, because Cabal is
> passing the same -odir -osuf etc. flags to Haddock that it uses
> when compiling. I'm not sure, but I think the DPH libs that we ship
> in 6.12.1 are actually compiled by Haddock :-)
> * The next time make is invoked, the object files have been touched,
> which causes a knock-on effect requiring Haddock to be run again.
> Now, the way I think it should work is that we shouldn't be doing any
> preprocessing at all, and Haddock should be run on the same source files
> that the build system is using. Hence Haddock will get to re-use the
> existing object files, things will be much quicker, and Haddock won't
> touch anything it shouldn't.
I don't think haddock should be doing any pre-processing, but certainly
we should not be telling it to generate the same .o files as we do for
the main build.
So if we changed the way Cabal invokes haddock to point it to a
different directory for object files, would that be sufficient?
We can't have haddock do the pre-processing, because Cabal does it
differently from haddock: Cabal uses a different unlit, and it also
handles pre-processors beyond unlit and cpp, such as alex and happy.
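Concretely (file names purely illustrative), for a module generated by
happy, Cabal itself runs the pre-processor into the build directory and
then compiles the generated source, something along the lines of:

    happy src/Parser.y -o dist/build/Parser.hs
    ghc -c dist/build/Parser.hs -odir dist/build -hidir dist/build

haddock only knows about the GHC-level pre-processing (cpp, its own
unlit), so it has to be handed the already-pre-processed sources.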