lazy

lazy :: a -> a
base GHC.Exts
The call '(lazy e)' means the same as e, but lazy has a magical strictness property: it is lazy in its first argument, even though its semantics is strict.
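Semantically lazy is just the identity, so the minimal sketch below simply prints 42; its purpose is to hide the argument's strictness from GHC's demand analyser (it is used this way in the implementation of par).
> import GHC.Exts (lazy)
>
> -- 'lazy' is semantically the identity, but the strictness analyser
> -- treats it as lazy in its argument, so it hides any strict demand
> -- on the value it wraps.
> main :: IO ()
> main = print (lazy (42 :: Int))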
lazyToStrictST :: ST s a -> ST s a
base Control.Monad.ST.Lazy
Convert a lazy ST computation into a strict one.
package lazy-csv
package
The CSV format is defined by RFC 4180. These efficient lazy parsers (String and ByteString variants) can report all CSV formatting errors, whilst also returning all the valid data, so the user can choose whether to continue, to show warnings, or to halt on error.  Valid fields retain information about their original location in the input, so a secondary parser from textual fields to typed values can give intelligent error messages. Version 0.5
package lazy-io
package
The library provides some basic but useful lazy IO functions. Keep in mind that lazy IO is generally discouraged. Perhaps a coroutine library (e.g. pipes) will better suit your needs. Version 0.1.0
package lazyarray
package
Built on the standard array package, this package adds support for lazy monolithic arrays. Such arrays are lazy not only in their values but also in their indexes. Read the paper "Efficient Graph Algorithms Using Lazy Monolithic Arrays" (http://citeseer.ist.psu.edu/95126.html) for further details. Version 0.1.3
lazyBufferOp :: BufferOp ByteString
HTTP Network.BufferType
lazyBufferOp is the BufferOp definition over ByteStrings, the non-strict kind.
lazyByteString :: ByteString -> Builder
bytestring Data.ByteString.Builder
Create a Builder denoting the same sequence of bytes as a lazy ByteString. The Builder inserts large chunks of the lazy ByteString directly, but copies small ones to ensure that the generated chunks are large on average.
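A small sketch, rendering the Builder back to a lazy ByteString with toLazyByteString from the same module:
> import qualified Data.ByteString.Lazy.Char8 as BL
> import Data.ByteString.Builder (lazyByteString, toLazyByteString)
>
> -- Wrap a lazy ByteString in a Builder and render it back unchanged.
> main :: IO ()
> main = BL.putStrLn (toLazyByteString (lazyByteString (BL.pack "hello, builder")))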
lazyByteStringCopy :: ByteString -> Builder
bytestring Data.ByteString.Builder.Extra
Construct a Builder that copies the lazy ByteString.
lazyByteStringHex :: ByteString -> Builder
bytestring Data.ByteString.Builder
Encode each byte of a lazy ByteString using its fixed-width hex encoding.
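For illustration, hex-encoding a short lazy ByteString (this should print 616263):
> import qualified Data.ByteString.Lazy.Char8 as BL
> import Data.ByteString.Builder (lazyByteStringHex, toLazyByteString)
>
> main :: IO ()
> main = BL.putStrLn (toLazyByteString (lazyByteStringHex (BL.pack "abc")))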
lazyByteStringInsert :: ByteString -> Builder
bytestring Data.ByteString.Builder.Extra
Construct a Builder that inserts all chunks of the lazy ByteString directly.
lazyByteStringThreshold :: Int -> ByteString -> Builder
bytestring Data.ByteString.Builder.Extra
Construct a Builder that uses the thresholding strategy of byteStringThreshold for each chunk of the lazy ByteString.
lazyInvariant :: Text -> Text
text Data.Text.Lazy.Internal
Check the invariant lazily.
package lazyio
package
Run IO actions lazily while respecting their order. Running a value of the LazyIO monad in the IO monad is like starting a thread that is driven by its output: the LazyIO action is only executed as far as necessary to provide the required data. Version 0.0.3.2
package lazysmallcheck
package
Lazy SmallCheck is a library for exhaustive, demand-driven testing of Haskell programs. It is based on the idea that if a property holds for a partially-defined input then it must also hold for all fully-defined refinements of that input. Compared to "eager" input generation as in SmallCheck, Lazy SmallCheck may require significantly fewer test cases to verify a property for all inputs up to a given depth. Version 0.6
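A hedged usage sketch, assuming the package's Test.LazySmallCheck module and its depthCheck driver:
> import Test.LazySmallCheck
>
> -- (++) is associative; check it for all Bool lists up to depth 4.
> prop_appendAssoc :: [Bool] -> [Bool] -> [Bool] -> Bool
> prop_appendAssoc xs ys zs = (xs ++ ys) ++ zs == xs ++ (ys ++ zs)
>
> main :: IO ()
> main = depthCheck 4 prop_appendAssoc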
package lazysplines
package
See the source of Numeric.LazySplines.Examples for usage. Version 0.1
module Control.Monad.ST.Lazy
base Control.Monad.ST.Lazy
This module presents an identical interface to Control.Monad.ST, except that the monad delays evaluation of state operations until a value depending on them is required.
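Because state operations are deferred, a lazy ST computation can describe an unbounded result that is consumed on demand; a minimal sketch:
> import Control.Monad.ST.Lazy (runST)
>
> -- An infinite list built inside the lazy ST monad; only the demanded
> -- prefix is ever computed (this would loop in the strict ST monad).
> nats :: [Integer]
> nats = runST (go 0)
>   where
>     go n = do
>       rest <- go (n + 1)
>       return (n : rest)
>
> main :: IO ()
> main = print (take 5 nats)  -- [0,1,2,3,4]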
module Data.STRef.Lazy
base Data.STRef.Lazy
Mutable references in the lazy ST monad.
module Control.Monad.RWS.Lazy
mtl Control.Monad.RWS.Lazy
Lazy RWS monad. Inspired by the paper Functional Programming with Overloading and Higher-Order Polymorphism, Mark P Jones (http://web.cecs.pdx.edu/~mpj/) Advanced School of Functional Programming, 1995.
module Control.Monad.State.Lazy
mtl Control.Monad.State.Lazy
Lazy state monads. This module is inspired by the paper Functional Programming with Overloading and Higher-Order Polymorphism, Mark P Jones (http://web.cecs.pdx.edu/~mpj/) Advanced School of Functional Programming, 1995.
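A brief usage sketch, numbering the elements of a list by threading a counter through the lazy State monad:
> import Control.Monad.State.Lazy
>
> -- Pair each element with its position.
> label :: [a] -> [(Int, a)]
> label xs = evalState (mapM step xs) 0
>   where
>     step x = do
>       n <- get
>       put (n + 1)
>       return (n, x)
>
> main :: IO ()
> main = print (label "abc")  -- [(0,'a'),(1,'b'),(2,'c')]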
module Control.Monad.Trans.RWS.Lazy
transformers Control.Monad.Trans.RWS.Lazy
A monad transformer that combines ReaderT, WriterT and StateT. This version is lazy; for a strict version, see Control.Monad.Trans.RWS.Strict, which has the same interface.
module Control.Monad.Trans.State.Lazy
transformers Control.Monad.Trans.State.Lazy
Lazy state monads, passing an updatable state through a computation. See below for examples. In this version, sequencing of computations is lazy. For a strict version, see Control.Monad.Trans.State.Strict, which has the same interface. Some computations may not require the full power of state transformers: * For a read-only state, see Control.Monad.Trans.Reader. * To accumulate a value without using it on the way, see Control.Monad.Trans.Writer.
module Control.Monad.Trans.Writer.Lazy
transformers Control.Monad.Trans.Writer.Lazy
The lazy WriterT monad transformer, which adds collection of outputs (such as a count or string output) to a given monad. This version builds its output lazily; for a strict version, see Control.Monad.Trans.Writer.Strict, which has the same interface. This monad transformer provides only limited access to the output during the computation. For more general access, use Control.Monad.Trans.State instead.
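A short sketch of the lazy WriterT over IO, collecting log messages alongside the result:
> import Control.Monad.Trans.Writer.Lazy
> import Control.Monad.Trans.Class (lift)
>
> program :: WriterT [String] IO Int
> program = do
>   tell ["starting"]
>   x <- lift (return 21)
>   tell ["doubling"]
>   return (x * 2)
>
> main :: IO ()
> main = do
>   (result, logMessages) <- runWriterT program
>   print result              -- 42
>   mapM_ putStrLn logMessages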
module Control.Monad.Writer.Lazy
mtl Control.Monad.Writer.Lazy
Lazy writer monads. Inspired by the paper Functional Programming with Overloading and Higher-Order Polymorphism, Mark P Jones (http://web.cecs.pdx.edu/~mpj/pubs/springschool.html) Advanced School of Functional Programming, 1995.
module Data.ByteString.Lazy
bytestring Data.ByteString.Lazy
A time and space-efficient implementation of lazy byte vectors using lists of packed Word8 arrays, suitable for high performance use, both for large data quantities and for high speed requirements. Lazy ByteStrings are encoded as lazy lists of strict chunks of bytes. A key feature of lazy ByteStrings is the means to manipulate large or unbounded streams of data without requiring the entire sequence to be resident in memory. To take advantage of this you have to write your functions in a lazy streaming style, e.g. classic pipeline composition. The default I/O chunk size is 32k, which should be good in most circumstances. Some operations, such as concat, append, reverse and cons, have better complexity than their Data.ByteString equivalents, due to optimisations resulting from the list spine structure. For other operations lazy ByteStrings are usually within a few percent of strict ones. The recommended way to assemble lazy ByteStrings from smaller parts is to use the builder monoid from Data.ByteString.Lazy.Builder. This module is intended to be imported qualified, to avoid name clashes with Prelude functions, e.g. > import qualified Data.ByteString.Lazy as B Original GHC implementation by Bryan O'Sullivan. Rewritten to use UArray by Simon Marlow. Rewritten to support slices and use ForeignPtr by David Roundy. Rewritten again and extended by Don Stewart and Duncan Coutts. Lazy variant by Duncan Coutts and Don Stewart.
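A streaming-style sketch (the file name is only illustrative): the whole file need not be resident in memory at once, since length consumes it chunk by chunk.
> import qualified Data.ByteString.Lazy as B
>
> main :: IO ()
> main = do
>   contents <- B.readFile "input.dat"  -- "input.dat" is a hypothetical file
>   print (B.length contents)           -- forces the file chunk by chunk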
module Data.IntMap.Lazy
containers Data.IntMap.Lazy
An efficient implementation of maps from integer keys to values (dictionaries). API of this module is strict in the keys, but lazy in the values. If you need value-strict maps, use Data.IntMap.Strict instead. The IntMap type itself is shared between the lazy and strict modules, meaning that the same IntMap value can be passed to functions in both modules (although that is rarely needed). These modules are intended to be imported qualified, to avoid name clashes with Prelude functions, e.g. > import Data.IntMap.Lazy (IntMap) > import qualified Data.IntMap.Lazy as IntMap The implementation is based on big-endian patricia trees. This data structure performs especially well on binary operations like union and intersection. However, my benchmarks show that it is also (much) faster on insertions and deletions when compared to a generic size-balanced map implementation (see Data.Map). * Chris Okasaki and Andy Gill, "Fast Mergeable Integer Maps", Workshop on ML, September 1998, pages 77-86, http://citeseer.ist.psu.edu/okasaki98fast.html * D.R. Morrison, "/PATRICIA -- Practical Algorithm To Retrieve Information Coded In Alphanumeric/", Journal of the ACM, 15(4), October 1968, pages 514-534. Operation comments contain the operation time complexity in the Big-O notation http://en.wikipedia.org/wiki/Big_O_notation. Many operations have a worst-case complexity of O(min(n,W)). This means that the operation can become linear in the number of elements with a maximum of W -- the number of bits in an Int (32 or 64).
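A quick sketch following the qualified-import convention above:
> import Data.IntMap.Lazy (IntMap)
> import qualified Data.IntMap.Lazy as IntMap
>
> squares :: IntMap Int
> squares = IntMap.fromList [(k, k * k) | k <- [1 .. 5]]
>
> main :: IO ()
> main = do
>   print (IntMap.lookup 3 squares)  -- Just 9
>   print (IntMap.member 7 squares)  -- False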
module Data.Map.Lazy
containers Data.Map.Lazy
An efficient implementation of ordered maps from keys to values (dictionaries). API of this module is strict in the keys, but lazy in the values. If you need value-strict maps, use Data.Map.Strict instead. The Map type itself is shared between the lazy and strict modules, meaning that the same Map value can be passed to functions in both modules (although that is rarely needed). These modules are intended to be imported qualified, to avoid name clashes with Prelude functions, e.g. > import qualified Data.Map.Lazy as Map The implementation of Map is based on size balanced binary trees (or trees of bounded balance) as described by: * Stephen Adams, "Efficient sets: a balancing act", Journal of Functional Programming 3(4):553-562, October 1993, http://www.swiss.ai.mit.edu/~adams/BB/. * J. Nievergelt and E.M. Reingold, "Binary search trees of bounded balance", SIAM journal of computing 2(1), March 1973. Note that the implementation is left-biased -- the elements of a first argument are always preferred to the second, for example in union or insert. Operation comments contain the operation time complexity in the Big-O notation (http://en.wikipedia.org/wiki/Big_O_notation).
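A sketch relying on the value-laziness described above: an undefined value may sit in the map as long as it is never demanded.
> import qualified Data.Map.Lazy as Map
>
> m :: Map.Map String Int
> m = Map.insert "never-used" undefined (Map.fromList [("answer", 42)])
>
> main :: IO ()
> main = print (Map.lookup "answer" m)  -- Just 42; the undefined value is never forced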
module Data.Text.Lazy
text Data.Text.Lazy
A time and space-efficient implementation of Unicode text using lists of packed arrays. Note: Read below the synopsis for important notes on the use of this module. The representation used by this module is suitable for high performance use and for streaming large quantities of data. It provides a means to manipulate a large body of text without requiring that the entire content be resident in memory. Some operations, such as concat, append, reverse and cons, have better time complexity than their Data.Text equivalents, due to the underlying representation being a list of chunks. For other operations, lazy Texts are usually within a few percent of strict ones, but often with better heap usage if used in a streaming fashion. For data larger than available memory, or if you have tight memory constraints, this module will be the only option. This module is intended to be imported qualified, to avoid name clashes with Prelude functions. eg. > import qualified Data.Text.Lazy as L
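A minimal sketch in the qualified-import style suggested above (Data.Text.Lazy.IO provides the lazy Text I/O functions):
> import qualified Data.Text.Lazy as L
> import qualified Data.Text.Lazy.IO as LIO
>
> main :: IO ()
> main = LIO.putStrLn (L.toUpper (L.pack "hello, lazy text"))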
module Network.Socket.ByteString.Lazy
network Network.Socket.ByteString.Lazy
This module provides access to the BSD socket interface. This module is generally more efficient than the String based network functions in Socket. For detailed documentation, consult your favorite POSIX socket reference. All functions communicate failures by converting the error number to IOError. This module is made to be imported with Socket like so: > import Network.Socket hiding (send, sendTo, recv, recvFrom) > import Network.Socket.ByteString.Lazy > import Prelude hiding (getContents)
module Text.Parsec.ByteString.Lazy
parsec Text.Parsec.ByteString.Lazy
Make lazy ByteStrings an instance of Stream with Char token type.
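A small sketch that imports the module for its Stream instance and parses digits from a lazy ByteString:
> import Text.Parsec
> import Text.Parsec.ByteString.Lazy ()   -- brings the Stream instance into scope
> import qualified Data.ByteString.Lazy.Char8 as BL
>
> number :: Parsec BL.ByteString () Int
> number = read <$> many1 digit
>
> main :: IO ()
> main = print (parse number "<literal>" (BL.pack "12345"))  -- Right 12345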
module Text.Parsec.Text.Lazy
parsec Text.Parsec.Text.Lazy
Make lazy Text an instance of Stream with Char token type.
module Text.Regex.Posix.ByteString.Lazy
regex-posix Text.Regex.Posix.ByteString.Lazy
This provides Lazy instances for RegexMaker and RegexLike based on Text.Regex.Posix.Wrap, and a (RegexContext Regex ByteString ByteString) instance. To use these instances, you would normally import Text.Regex.Posix. You only need to import this module to use the medium level API of the compile, regexec, and execute functions. All of these report errors by returning Left values instead of undefined or error or fail. A lazy ByteString with more than one chunk cannot be passed to the library efficiently (as a pointer). It will have to be converted via a full copy to a temporary strict ByteString (with a null byte appended if necessary).
package LazyVault
package
LazyVault is a sandboxing tool for installing libraries and executables within a sandboxed environment. At the moment it is only supported under Unix or GNU systems, and it has only been tested under GNU/Linux. This program creates cabal sandboxes which you can use globally. For a detailed explanation of how this works, refer to the README file found on the GitHub page. Version 0.0.1
strictToLazyST :: ST s a -> ST s a
base Control.Monad.ST.Lazy
Convert a strict ST computation into a lazy one. The strict state thread passed to strictToLazyST is not performed until the result of the lazy state thread it returns is demanded.
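A sketch that embeds a strict ST computation in a lazy state thread; the strict part runs only when answer is demanded:
> import Control.Monad.ST.Lazy (runST, strictToLazyST)
> import qualified Control.Monad.ST as Strict
> import Data.STRef (newSTRef, modifySTRef, readSTRef)
>
> answer :: Int
> answer = runST (strictToLazyST strictPart)
>   where
>     strictPart :: Strict.ST s Int
>     strictPart = do
>       r <- newSTRef 41
>       modifySTRef r (+ 1)
>       readSTRef r
>
> main :: IO ()
> main = print answer  -- 42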
fromLazyText :: Text -> Builder
text Data.Text.Lazy.Builder
O(1). A Builder taking a lazy Text, satisfying toLazyText (fromLazyText t) = t.
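A tiny sketch of the round trip, using toLazyText from the same module:
> import qualified Data.Text.Lazy as TL
> import qualified Data.Text.Lazy.IO as TLIO
> import Data.Text.Lazy.Builder (fromLazyText, toLazyText)
>
> main :: IO ()
> main = TLIO.putStrLn (toLazyText (fromLazyText (TL.pack "round trip")))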
package MonadRandomLazy
package
Support for lazy computations which consume random values. Version 0.1
package NumLazyByteString
package
Num, Enum, Eq, Integral, Ord, Real, and Show instances for lazy ByteStrings. Version 0.0.0.1
primMapLazyByteStringBounded :: BoundedPrim Word8 -> ByteString -> Builder
bytestring Data.ByteString.Builder.Prim
Chunk-wise application of primMapByteStringBounded.
primMapLazyByteStringFixed :: FixedPrim Word8 -> (ByteString -> Builder)
bytestring Data.ByteString.Builder.Prim
Heavy inlining. Encode all bytes of a lazy ByteString from left-to-right with a FixedPrim.
RTLD_LAZY :: RTLDFlags
unix System.Posix.DynamicLinker.Prim
package safe-lazy-io
package
Provides a safer API for incremental IO processing in a way very close to standard lazy IO. Version 0.1
