Stop using "Int" for microsecond delays in "base"

Paul Johnson paul at cogito.org.uk
Sat Mar 26 18:20:54 CET 2011


The "base" library has the "threadDelay" primitive, which takes an Int 
argument in microseconds.  Unfortunately this means that the longest 
delay you can get on a 32 bit machine with GHC is just under 36 minutes 
(2^31 uSec), and a hypothetical compiler that only used 30 bit integers 
(as per the standard) would get under 10 minutes.  It is a bit tricky to 
write general-purpose libraries with this.  I think that there should be a

    type Delay = Int64

declaration, and that threadDelay and related functions should take that 
as an argument type.
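
For comparison, a one-hour delay is 3600 * 10^6 microseconds, which 
already overflows a 32-bit Int.  Today every library has to work around 
this by looping in chunks; here is a rough sketch of that workaround 
(the name threadDelayLong is made up for illustration):

    import Control.Concurrent (threadDelay)
    import Data.Int (Int64)

    -- Sleep for a delay that may exceed (maxBound :: Int)
    -- microseconds by sleeping in maxBound-sized chunks.
    threadDelayLong :: Int64 -> IO ()
    threadDelayLong us
      | us <= 0   = return ()
      | otherwise = do
          let step = min us (fromIntegral (maxBound :: Int))
          threadDelay (fromIntegral step)
          threadDelayLong (us - step)

A Delay type in "base" would let threadDelay do this itself instead of 
every caller reinventing it.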

Paul.


