[Haskell-cafe] How far compilers are allowed to go with optimizations?

Johan Holmquist holmisen at gmail.com
Mon Feb 11 17:47:48 CET 2013


I was about to leave this topic so as not to swamp the list with
something that appears to be going nowhere. But now I feel that I
must answer the comments, so here goes.

By "agressive optimisation" I mean an optimisation that drastically
reduces run-time performance of (some part of) the program. So I guess
automatic vectorisation could fall into this term.
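
To make this concrete, here is a small sketch of the kind of thing I
have in mind (my own toy example, nothing from the actual scenario):

    -- With -O2, GHC's list fusion typically turns this into a tight
    -- loop that never allocates the intermediate list.
    sumDoubled :: Int -> Int
    sumDoubled n = sum (map (*2) [1..n])

    -- Consuming the same list twice (say, to also report its length)
    -- prevents fusion: the list now has to be built in memory.  An
    -- innocent-looking change with a large run-time cost.
    sumDoubled2 :: Int -> Int
    sumDoubled2 n = sum xs + length xs
      where xs = map (*2) [1..n]

Whether the first version actually fuses depends on the GHC version
and flags, of course, but that is exactly the kind of uncertainty I
am worried about.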

In my scenario the original programmer did not have any particular
reason for the strange code -- it just happened; we all write bad or
suboptimal code sometimes. Hence there would be no comment in the
code explaining it. And compiling without optimisations would not
prevent the scenario above -- it would only have helped if the
original programmer(s) had also compiled without optimisations, which
was obviously not the case.

We seem to agree that the only way to know what a change will do to
the performance of a program is to test it (using whatever tools are
available). That is what makes me a bit uncomfortable, because it
means we do not really understand the performance behaviour of our
programs. Disabling all optimisations would definitely be a step
backwards and not where we want to go.
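
As an example of the kind of test I mean, a micro-benchmark with e.g.
the criterion package would do (the functions below are just
placeholders, not from any real code):

    import Criterion.Main

    -- Two versions of the same computation; stand-ins for whatever
    -- code is actually being changed.
    fused, unfused :: Int -> Int
    fused n   = sum (map (*2) [1..n])
    unfused n = let xs = map (*2) [1..n] in sum xs + length xs

    main :: IO ()
    main = defaultMain
      [ bench "fused"   (whnf fused   1000000)
      , bench "unfused" (whnf unfused 1000000)
      ]

But a benchmark only tells me what happened with this compiler, these
flags and this input; it does not explain why, which was my point.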

Then again, maybe this problem is mostly of theoretical concern and
not at all common in practice. I guess it *is* far more likely that
clean code will trigger optimisations. But we cannot really know for
sure...

Regards
Johan


