Lazy - bytestring
This module presents an identical interface to Control.Monad.ST, except that the monad delays evaluation of state operations until a value depending on them is required.
Mutable references in the lazy ST monad.
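As a brief illustration of that on-demand behaviour (a small sketch, not taken from the modules' own documentation), the lazy ST monad together with Data.STRef.Lazy can drive a conceptually infinite state thread, of which only the demanded prefix is ever performed:

> import Control.Monad.ST.Lazy (runST)
> import Data.STRef.Lazy (newSTRef, readSTRef, writeSTRef)
>
> -- A conceptually infinite state thread: increments happen only as far as
> -- the consumer of the result list demands.
> naturals :: [Integer]
> naturals = runST (newSTRef 0 >>= go)
>   where
>     go ref = do
>       n <- readSTRef ref
>       writeSTRef ref (n + 1)
>       rest <- go ref
>       return (n : rest)
>
> main :: IO ()
> main = print (take 5 naturals)   -- [0,1,2,3,4]

With the strict ST monad the same state thread would loop forever before returning anything.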
Lazy RWS monad.
Inspired by the paper Functional Programming with Overloading and Higher-Order Polymorphism, Mark P Jones (http://web.cecs.pdx.edu/~mpj/) Advanced School of Functional Programming, 1995.
Lazy state monads.
This module is inspired by the paper Functional Programming with Overloading and Higher-Order Polymorphism, Mark P Jones (http://web.cecs.pdx.edu/~mpj/) Advanced School of Functional Programming, 1995.
A monad transformer that combines ReaderT, WriterT and StateT. This version is lazy; for a strict version, see Control.Monad.Trans.RWS.Strict, which has the same interface.
Lazy state monads, passing an updatable state through a computation. See below for examples.
In this version, sequencing of computations is lazy. For a strict version, see Control.Monad.Trans.State.Strict, which has the same interface.
Some computations may not require the full power of state transformers:
* For a read-only state, see Control.Monad.Trans.Reader.
* To accumulate a value without using it on the way, see Control.Monad.Trans.Writer.
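As a small example of that laziness (a sketch, not drawn from the module's documentation), a lazy State computation can number an infinite list and still yield a finite prefix of the result; the Strict variant would diverge here:

> import Control.Monad.Trans.State.Lazy (State, evalState, state)
>
> -- Pair each element with a running index, threading the counter lazily.
> number :: [a] -> State Int [(Int, a)]
> number = mapM (\x -> state (\n -> ((n, x), n + 1)))
>
> main :: IO ()
> main = print (take 3 (evalState (number [10, 20 ..]) 0))
> -- [(0,10),(1,20),(2,30)]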
The lazy WriterT monad transformer, which adds collection of outputs (such as a count or string output) to a given monad.
This version builds its output lazily; for a strict version, see Control.Monad.Trans.Writer.Strict, which has the same interface.
This monad transformer provides only limited access to the output during the computation. For more general access, use Control.Monad.Trans.State instead.
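A small sketch (not taken from the module's documentation) of what building the output lazily buys: a prefix of the accumulated output can be consumed before the computation finishes, even if it never does.

> import Control.Monad.Trans.Writer.Lazy (Writer, execWriter, tell)
>
> -- Log every element; the output list is produced incrementally.
> logAll :: [Int] -> Writer [Int] ()
> logAll = mapM_ (\x -> tell [x])
>
> main :: IO ()
> main = print (take 3 (execWriter (logAll [1 ..])))
> -- [1,2,3]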
An efficient implementation of maps from integer keys to values (dictionaries).
The API of this module is strict in the keys, but lazy in the values. If you need value-strict maps, use Data.IntMap.Strict instead. The IntMap type itself is shared between the lazy and strict modules, meaning that the same IntMap value can be passed to functions in both modules (although that is rarely needed).
These modules are intended to be imported qualified, to avoid name clashes with Prelude functions, e.g.
> import Data.IntMap.Lazy (IntMap)
> import qualified Data.IntMap.Lazy as IntMap
The implementation is based on big-endian patricia trees. This data structure performs especially well on binary operations like union and intersection. However, my benchmarks show that it is also (much) faster on insertions and deletions when compared to a generic size-balanced map implementation (see Data.Map).
* Chris Okasaki and Andy Gill, "Fast Mergeable Integer Maps", Workshop on ML, September 1998, pages 77-86, http://citeseer.ist.psu.edu/okasaki98fast.html
* D.R. Morrison, "/PATRICIA -- Practical Algorithm To Retrieve Information Coded In Alphanumeric/", Journal of the ACM, 15(4), October 1968, pages 514-534.
Operation comments contain the operation time complexity in the Big-O notation http://en.wikipedia.org/wiki/Big_O_notation. Many operations have a worst-case complexity of O(min(n,W)). This means that the operation can become linear in the number of elements with a maximum of W -- the number of bits in an Int (32 or 64).
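For instance (a minimal sketch, not from the module's documentation), value laziness means an unevaluated or even undefined value can sit in the map harmlessly until it is demanded:

> import qualified Data.IntMap.Lazy as IntMap
>
> main :: IO ()
> main = do
>   -- The value stored at key 1 is never forced, so the error never fires.
>   let m = IntMap.insert 1 (error "never evaluated")
>                           (IntMap.fromList [(2, 42 :: Int)])
>   print (IntMap.lookup 2 m)   -- Just 42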
An efficient implementation of ordered maps from keys to values (dictionaries).
The API of this module is strict in the keys, but lazy in the values. If you need value-strict maps, use Data.Map.Strict instead. The Map type itself is shared between the lazy and strict modules, meaning that the same Map value can be passed to functions in both modules (although that is rarely needed).
These modules are intended to be imported qualified, to avoid name clashes with Prelude functions, e.g.
> import qualified Data.Map.Lazy as Map
The implementation of Map is based on size balanced binary trees (or trees of bounded balance) as described by:
* Stephen Adams, "Efficient sets: a balancing act", Journal of Functional Programming 3(4):553-562, October 1993, http://www.swiss.ai.mit.edu/~adams/BB/.
* J. Nievergelt and E.M. Reingold, "Binary search trees of bounded balance", SIAM journal of computing 2(1), March 1973.
Note that the implementation is left-biased -- the elements of the first argument are always preferred to those of the second, for example in union or insert.
Operation comments contain the operation time complexity in the Big-O notation (http://en.wikipedia.org/wiki/Big_O_notation).
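The left bias mentioned above can be seen directly (a small sketch, not part of the module's documentation):

> import qualified Data.Map.Lazy as Map
>
> main :: IO ()
> main = do
>   -- union keeps the value from its first (left) argument on duplicate keys
>   print (Map.union (Map.fromList [(1, "left")]) (Map.fromList [(1, "right")]))
>   -- fromList [(1,"left")]
>   -- insert replaces any value already present at the key
>   print (Map.insert 1 "new" (Map.fromList [(1 :: Int, "old")]))
>   -- fromList [(1,"new")]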
A time and space-efficient implementation of Unicode text using lists of packed arrays.
Note: Read below the synopsis for important notes on the use of this module.
The representation used by this module is suitable for high performance use and for streaming large quantities of data. It provides a means to manipulate a large body of text without requiring that the entire content be resident in memory.
Some operations, such as concat, append, reverse and cons, have better time complexity than their Data.Text equivalents, due to the underlying representation being a list of chunks. For other operations, lazy Texts are usually within a few percent of strict ones, but often with better heap usage if used in a streaming fashion. For data larger than available memory, or if you have tight memory constraints, this module will be the only option.
This module is intended to be imported qualified, to avoid name clashes with Prelude functions, e.g.
> import qualified Data.Text.Lazy as L
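A small sketch (not from the module's documentation) of the chunked representation at work: cons is cheap, and only the demanded prefix of an unbounded Text is ever built:

> import qualified Data.Text.Lazy as L
>
> main :: IO ()
> main = putStrLn (L.unpack (L.take 20 (L.cons '!' (L.cycle (L.pack " lazy text")))))
> -- prints: ! lazy text lazy tex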
This module provides access to the BSD socket interface. It is generally more efficient than the String-based network functions in Network.Socket. For detailed documentation, consult your favorite POSIX socket reference. All functions communicate failures by converting the error number to an IOError.
This module is made to be imported with Network.Socket like so:
> import Network.Socket hiding (send, sendTo, recv, recvFrom)
> import Network.Socket.ByteString.Lazy
> import Prelude hiding (getContents)
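A sketch of a client built on those imports (the host, port, and request line are placeholders, and sendAll/getContents are assumed to be the sending and streaming functions exported by this module): the reply is streamed lazily, so only the demanded prefix is actually read from the socket.

> import Network.Socket hiding (send, sendTo, recv, recvFrom)
> import Network.Socket.ByteString.Lazy (sendAll, getContents)
> import Prelude hiding (getContents)
> import qualified Data.ByteString.Lazy.Char8 as L
>
> main :: IO ()
> main = do
>   let hints = defaultHints { addrSocketType = Stream }
>   addr:_ <- getAddrInfo (Just hints) (Just "example.com") (Just "80")  -- placeholders
>   sock <- socket (addrFamily addr) (addrSocketType addr) (addrProtocol addr)
>   connect sock (addrAddress addr)
>   sendAll sock (L.pack "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
>   reply <- getContents sock      -- lazy: the reply is read on demand
>   L.putStr (L.take 200 reply)    -- forces only the first 200 bytes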
Make lazy ByteStrings an instance of Stream with Char token type.
Make Text an instance of Stream with Char token type.
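With those Stream instances in scope, ordinary parsec combinators run directly over lazy ByteStrings or Texts. A minimal sketch, assuming the lazy Text instance and Parser alias live in Text.Parsec.Text.Lazy:

> import Text.Parsec (digit, many1, parse)
> import Text.Parsec.Text.Lazy (Parser)   -- brings the lazy Text Stream instance into scope
> import qualified Data.Text.Lazy as TL
>
> digits :: Parser String
> digits = many1 digit
>
> main :: IO ()
> main = print (parse digits "" (TL.pack "12345abc"))
> -- Right "12345"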
This provides Lazy instances for RegexMaker and RegexLike based on Text.Regex.Posix.Wrap, and a (RegexContext Regex ByteString ByteString) instance.
To use these instances, you would normally import Text.Regex.Posix. You only need to import this module to use the medium-level API of the compile, regexec, and execute functions. All of these report errors by returning Left values instead of using undefined, error, or fail.
A lazy ByteString with more than one chunk cannot be passed to the library efficiently (as a pointer); it will have to be converted via a full copy to a temporary strict ByteString (with a null byte appended if necessary).
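A small sketch of the high-level use (the pattern and input are made up, and the pattern is given as a String for simplicity): with these instances in scope, the =~ operator from Text.Regex.Posix can take and return lazy ByteStrings directly.

> import Text.Regex.Posix ((=~))
> import qualified Data.ByteString.Lazy.Char8 as L
>
> main :: IO ()
> main = do
>   let input = L.pack "user=alice id=42"
>       -- the result type annotation selects the lazy ByteString RegexContext instance
>       matched = input =~ ("id=[0-9]+" :: String) :: L.ByteString
>   L.putStrLn matched   -- id=42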
LazyVault is a sandboxing tool for installing libraries and executables in sandboxed environments. At the moment it is only supported on Unix or GNU systems, and it has only been tested under GNU/Linux. It creates cabal sandboxes which you can use globally. For a detailed explanation of how this works, refer to the README file on the GitHub page.
Version 0.0.1
The call '(lazy e)' means the same as e, but lazy has a magical strictness property: it is lazy in its first argument, even though its semantics is strict.
Convert a lazy ST computation into a strict one.
The CSV format is defined by RFC 4180. These efficient lazy parsers (String and ByteString variants) can report all CSV formatting errors, whilst also returning all the valid data, so the user can choose whether to continue, to show warnings, or to halt on error. Valid fields retain information about their original location in the input, so a secondary parser from textual fields to typed values can give intelligent error messages.
Version 0.5
The library provides some basic but useful lazy IO functions. Keep in mind that lazy IO is generally discouraged. Perhaps a coroutine library (e.g. pipes) will better suit your needs.
Version 0.1.0
Built on the standard array package, this package adds support for lazy monolithic arrays. Such arrays are lazy not only in their values, but in their indexes as well. Read the paper "Efficient Graph Algorithms Using Lazy Monolithic Arrays" (http://citeseer.ist.psu.edu/95126.html) for further details.
Version 0.1.3
lazyBufferOp is the BufferOp definition over ByteStrings, the non-strict kind.
Check the invariant lazily.
Run IO actions lazily while respecting their order. Running a value of the LazyIO monad in the IO monad is like starting a thread that is, however, driven by its output: the LazyIO action is only executed as far as necessary to provide the required data.
Version 0.0.3.2
Lazy SmallCheck is a library for exhaustive, demand-driven testing of Haskell programs. It is based on the idea that if a property holds for a partially-defined input, then it must also hold for all fully-defined refinements of that input. Compared to "eager" input generation as in SmallCheck, Lazy SmallCheck may require significantly fewer test cases to verify a property for all inputs up to a given depth.
Version 0.6
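A minimal usage sketch (the property is made up, and depthCheck is assumed to be the driver exported by Test.LazySmallCheck): the library only refines the parts of an input the property actually inspects.

> import Test.LazySmallCheck (depthCheck)
>
> -- A property over lists; partially-defined inputs that already decide the
> -- outcome are not refined further.
> prop_revUnit :: Bool -> [Bool] -> Bool
> prop_revUnit x xs = reverse (xs ++ [x]) == x : reverse xs
>
> main :: IO ()
> main = depthCheck 4 prop_revUnit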
See the source of Numeric.LazySplines.Examples for usage.
Version 0.1
Convert a strict ST computation into a lazy one. The strict state thread passed to strictToLazyST is not performed until the result of the lazy state thread it returns is demanded.
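A brief sketch (not from the module's documentation) of mixing the two worlds: an ordinary strict ST computation embedded in a lazy state thread via strictToLazyST.

> import Control.Monad.ST (ST)
> import qualified Control.Monad.ST.Lazy as Lazy
> import Data.STRef (newSTRef, modifySTRef, readSTRef)
>
> -- An ordinary strict ST computation...
> counter :: ST s Int
> counter = do
>   r <- newSTRef 0
>   modifySTRef r (+ 1)
>   readSTRef r
>
> -- ...which is only performed once the lazy state thread's result is demanded.
> main :: IO ()
> main = print (Lazy.runST (Lazy.strictToLazyST counter))   -- 1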
O(1). A Builder taking a lazy Text, satisfying
* toLazyText (fromLazyText t) = t
Support for lazy computations which consume random values.
Version 0.1
Num, Enum, Eq, Integral, Ord, Real, and Show instances for Lazy ByteStrings
Version 0.0.0.1
Provides a safer API for incremental IO processing in a way very close to standard lazy IO.
Version 0.1
O(n). Extract a lazy Text from a Builder with a default buffer size. The construction work takes place if and when the relevant part of the lazy Text is demanded.
O(n). Extract a lazy Text from a Builder, using the given size for the initial buffer. The construction work takes place if and when the relevant part of the lazy Text is demanded.
If the initial buffer is too small to hold all data, subsequent buffers will be the default buffer size.
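Combining fromLazyText (described above) with toLazyText (a small sketch, not from the module's documentation):

> import Data.Text.Lazy.Builder (Builder, fromLazyText, toLazyText)
> import qualified Data.Text.Lazy as TL
>
> -- Builders compose cheaply; the lazy Text is only constructed when the
> -- result of toLazyText is demanded.
> greeting :: Builder
> greeting = fromLazyText (TL.pack "hello, ") <> fromLazyText (TL.pack "world")
>
> main :: IO ()
> main = print (toLazyText greeting)   -- "hello, world"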
Functor-lazy vectors perform the fmap operation in constant time, whereas other vectors require linear time. All vector operations are supported except for slicing. See http://github.com/mikeizbicki/vector-functorlazy for details on how this module works under the hood.
Version 0.0.1