POSIX IO support. These types and functions correspond to the Unix functions open(2), close(2), etc. For more portable functions in the style of fopen(3) and friends from stdio.h, see System.IO.
The IOR monad is a wrapper around IO that allows region-based resource management.
After GHC 7.2 a new `casMutVar#` primop became available, but was not yet exposed in Data.IORef. This package fills that gap until such a time as Data.IORef obsoletes it.
Further, in addition to exposing native Haskell CAS operations, this package contains "mockups" that imitate the same functionality using either atomicModifyIORef and unsafe pointer equality (Data.CAS.Fake) or foreign functions (Data.CAS.Foreign). These alternatives are useful for debugging.
Note that the foreign option does not operate on IORefs and so is directly interchangeable with `Data.CAS` and `Data.CAS.Fake` only if the interface in `Data.CAS.Class` is used.
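The mockup approach can be illustrated with a small sketch: a compare-and-swap built on top of atomicModifyIORef. This is a simplification of what Data.CAS.Fake actually does — the package compares by pointer equality, while the sketch below uses an Eq constraint, and `casLike` is a hypothetical name, not this package's API:

```haskell
import Data.IORef

-- A simplified compare-and-swap in the spirit of the Data.CAS.Fake
-- mockup.  The real package compares by pointer equality; this sketch
-- uses an Eq constraint instead, which is a deliberate simplification.
casLike :: Eq a => IORef a -> a -> a -> IO (Bool, a)
casLike ref expected new =
  atomicModifyIORef ref $ \cur ->
    if cur == expected
      then (new, (True, cur))   -- swap succeeded
      else (cur, (False, cur))  -- swap failed; report the current value

main :: IO ()
main = do
  r <- newIORef (0 :: Int)
  (ok1, _) <- casLike r 0 1   -- succeeds: value was 0
  (ok2, _) <- casLike r 0 2   -- fails: value is now 1
  print (ok1, ok2)            -- prints (True,False)
```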
This package consists of several modules that give a pure specification of functions in the IO monad:
* Test.IOSpec.Fork: a pure specification of forkIO.
* Test.IOSpec.IORef: a pure specification of most functions that create and manipulate IORefs.
* Test.IOSpec.MVar: a pure specification of most functions that create and manipulate MVars.
* Test.IOSpec.STM: a pure specification of atomically and the STM monad.
* Test.IOSpec.Teletype: a pure specification of getChar, putChar, and several related Prelude functions.
Besides these modules containing the specifications, there are a few other important modules:
* Test.IOSpec.Types: defines the IOSpec type and several amenities.
* Test.IOSpec.VirtualMachine: defines a virtual machine on which to execute pure specifications.
* Test.IOSpec.Surrogate: a drop-in replacement for the other modules. Import this and recompile your code once you've finished testing and debugging.
There are several well-documented examples included with the source distribution.
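To illustrate the idea of a pure specification, here is a self-contained toy (not the library's actual API): a teletype program represented as a plain data structure, executed by a pure interpreter against a list of input characters — the same role Test.IOSpec.VirtualMachine plays for the real specifications.

```haskell
-- A hypothetical miniature of the Test.IOSpec idea: teletype
-- interaction as a pure data structure.
data Teletype a
  = GetChar (Char -> Teletype a)
  | PutChar Char (Teletype a)
  | Done a

-- Echo one character: a tiny "program" written against the spec.
echo :: Teletype ()
echo = GetChar (\c -> PutChar c (Done ()))

-- Pure interpreter: feed input, collect output.  Because it is pure,
-- programs can be tested with QuickCheck instead of real IO.
run :: Teletype a -> String -> (a, String)
run (Done x)      _      = (x, "")
run (GetChar k)   (c:cs) = run (k c) cs
run (GetChar _)   []     = error "run: out of input"
run (PutChar c t) input  = let (x, out) = run t input
                           in (x, c : out)
```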
capture IO action's stdout and stderr
Choice for IO and lifted IO
A skeleton library to help learners of Haskell concentrate on the pure-functional aspect and let the IO be handled by the library.
Transform an IO action into a similar IO action that performs the original action only once.
You can choose to perform the original action in one of the following ways:
* lazily (might never be performed)
* concurrently (eager)
Special thanks to shachaf and headprogrammingczar from #haskell irc for helping me reason about the behavior of this library.
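The "run at most once" behaviour can be sketched with an MVar cache. `once` below is a hypothetical helper, not necessarily this package's API, and it corresponds to the on-demand (non-eager, non-lazy) variant: the action runs on the first call and its result is shared afterwards.

```haskell
import Control.Concurrent.MVar

-- Wrap an IO action so that repeated calls run the underlying action
-- at most once and share its result.  The MVar also serialises
-- concurrent first calls, so the action cannot run twice.
once :: IO a -> IO (IO a)
once action = do
  cache <- newMVar Nothing
  pure $ modifyMVar cache $ \st ->
    case st of
      Just x  -> pure (Just x, x)     -- already ran: reuse cached result
      Nothing -> do x <- action       -- first call: run and cache
                    pure (Just x, x)
```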
An API for generating reactive objects, as used in the TIMBER programming language.
This library allows an application to extend the 'global state' hidden inside the IO monad with semi-arbitrary data. Data is required to be Typeable. The library provides an essentially unbounded number of key-value stores indexed by strings, with each key within the stores also being a string.
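The underlying trick such a library can use is the classic unsafePerformIO/NOINLINE global-variable idiom plus Data.Dynamic for the Typeable requirement. The helpers below are hypothetical illustrations, not this package's actual API:

```haskell
import Data.Dynamic
import Data.IORef
import Data.Typeable (Typeable)
import System.IO.Unsafe (unsafePerformIO)

-- One global association list from (store name, key) to dynamically
-- typed values, created with the unsafePerformIO/NOINLINE idiom.
store :: IORef [((String, String), Dynamic)]
store = unsafePerformIO (newIORef [])
{-# NOINLINE store #-}

-- Insert or overwrite a value under (store, key).
putValue :: Typeable a => String -> String -> a -> IO ()
putValue st key x =
  atomicModifyIORef store $ \kvs ->
    (((st, key), toDyn x) : filter ((/= (st, key)) . fst) kvs, ())

-- Look a value up; fromDynamic returns Nothing on a type mismatch,
-- which is what makes the store type-safe to read.
getValue :: Typeable a => String -> String -> IO (Maybe a)
getValue st key = do
  kvs <- readIORef store
  pure (fromDynamic =<< lookup (st, key) kvs)
```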
The io-streams library contains simple and easy-to-use primitives for I/O using streams. Most users will want to import the top-level convenience module System.IO.Streams, which re-exports most of the library:
> import System.IO.Streams (InputStream, OutputStream)
> import qualified System.IO.Streams as Streams
For first-time users, io-streams comes with an included tutorial, which can be found in the System.IO.Streams.Tutorial module.
The io-streams user API has two basic types: InputStream a and OutputStream a, and three fundamental I/O primitives:
@
-- read an item from an input stream
Streams.read :: InputStream a -> IO (Maybe a)

-- push an item back to an input stream
Streams.unRead :: a -> InputStream a -> IO ()

-- write to an output stream
Streams.write :: Maybe a -> OutputStream a -> IO ()
@
Streams can be transformed by composition and hooked together with provided combinators:
> ghci> Streams.fromList [1,2,3::Int] >>= Streams.map (*10) >>= Streams.toList
> [10,20,30]
Stream composition leaves the original stream accessible:
> ghci> input <- Streams.fromByteString "long string"
> ghci> wrapped <- Streams.takeBytes 4 input
> ghci> Streams.read wrapped
> Just "long"
> ghci> Streams.read wrapped
> Nothing
> ghci> Streams.read input
> Just " string"
Because the types and operations live directly in the IO monad, exception handling and resource cleanup are straightforward, using Haskell standard library facilities like Control.Exception.bracket.
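This is the ordinary bracket pattern, nothing io-streams-specific: the release action runs even when the body throws. A minimal self-contained demonstration (using an IORef trail to record the order of events):

```haskell
import Control.Exception
import Data.IORef

-- bracket guarantees the release action runs even when the body
-- throws, which is what makes plain-IO streams easy to clean up after.
demo :: IO [String]
demo = do
  trail <- newIORef []
  let record s = modifyIORef trail (++ [s])
  _ <- try (bracket (record "acquire")            -- acquire the resource
                    (\_ -> record "release")      -- release, even on error
                    (\_ -> throwIO (ErrorCall "boom")))  -- body fails
         :: IO (Either SomeException ())
  readIORef trail
-- demo returns ["acquire","release"]: cleanup ran despite the exception
```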
io-streams comes with:
* functions to use files, handles, concurrent channels, sockets, lists, vectors, and more as streams.
* a variety of combinators for wrapping and transforming streams, including compression and decompression using zlib, controlling precisely how many bytes are read from or written to a stream, buffering output using bytestring builders, folds, maps, filters, zips, etc.
* support for parsing from streams using attoparsec.
* support for spawning processes and communicating with them using streams.
* 188.8.131.52: Allowed newest versions of the process, test-framework, and text libraries.
* 184.108.40.206: Fixed build error when compiled against attoparsec-0.10.0.x.
* 220.127.116.11: Added System.IO.Streams.Concurrent.makeChanPipe, to create a simple concurrent pipe between an InputStream/OutputStream pair.
* 18.104.22.168: Added System.IO.Streams.Network.socketToStreamsWithBufferSize, allowing control over the size of the receive buffers used when reading from sockets.
* 22.214.171.124: Fixed an inconsistent version upper bound in the test suite.
* 126.96.36.199: Fixed a typo in the tutorial.
* 188.8.131.52: A couple of Haddock markup fixes.
* 184.108.40.206: Reworked, simplified, and streamlined the internals of the library. Exports from System.IO.Streams.Internal relying on Sources and Sinks were deleted because they are no longer necessary: Source(..), Sink(..), defaultPushback, withDefaultPushback, nullSource, nullSink, singletonSource, simpleSource, sourceToStream, sinkToStream, generatorToSource, and consumerToSink.
* 220.127.116.11: Fixed a bug in which "takeBytes 0" was erroneously requesting input from the wrapped stream.
* 18.104.22.168: Fixed a compile error on GHC 7.0.x.
* 22.214.171.124: Added System.IO.Streams.Process (support for communicating with system processes using streams); added new functions to System.IO.Streams.Handle for converting io-streams types to System.IO.Handles. (Now you can pass streams from this library to places that expect Handles and everything will work.)
* 126.96.36.199: Added System.IO.Streams.Combinators.ignoreEof.
* 188.8.131.52: Fixed some Haddock markup.
Package allowing type-safe I/O control
This module provides facilities for building transactions out of IO actions in such a way that, if one IO action in a transaction throws an exception, the effects of previous actions will be undone.
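A minimal sketch of the rollback idea (hypothetical helper, not this package's API): run a list of (action, undo) pairs in order; if an action throws, replay the undos of the actions that already succeeded, most recent first, before letting the exception propagate.

```haskell
import Control.Exception
import Data.IORef

-- Run (action, undo) pairs in order.  On failure, the accumulated
-- undos run newest-first; the failing action's own undo does not run,
-- since that action never completed.
runTransaction :: [(IO (), IO ())] -> IO ()
runTransaction = go []
  where
    go _     []                      = pure ()
    go undos ((action, undo) : rest) = do
      action `onException` sequence_ undos
      go (undo : undos) rest
```

The exception handler is attached only to each individual action, so when a later step fails the undos run exactly once at the failing frame, and the exception then propagates untouched through the outer frames.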
This package contains a Haskell representation and parser for ABC notation.
ABC notation is a text-based music notation system designed to be comprehensible by both people and computers. For more information see http://abcnotation.com.
Based on the 2.1 standard.
This package provides efficient conversion routines between a range of array types and Accelerate arrays.
Refer to the main Accelerate package for more information: http://hackage.haskell.org/package/accelerate
Sequence a set of Alternative actions in each possible order, based on "Parsing Permutation Phrases", by Arthur Baars, Andres Loeh and S. Doaitse Swierstra, Haskell Workshop 2001. This is particularly useful for constructing a parser for permutations of elements. This version has a slightly different interface from the paper.
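A naive reference semantics for permutation phrases can be written in a few lines: try the actions in every possible order and combine the successful orders with the Alternative choice operator. The library is far more efficient (it shares common prefixes between orders, per the paper), but this conveys the observable behaviour; `perms` is an illustrative name, not the package's interface.

```haskell
import Control.Applicative (Alternative)
import Data.Foldable (asum)
import Data.List (permutations)

-- Naive permutation-phrase semantics: every ordering of the actions,
-- combined with <|> via asum.  For Maybe this yields the first order
-- that succeeds; for a backtracking functor, all successful orders.
perms :: Alternative f => [f a] -> f [a]
perms actions = asum [ sequenceA order | order <- permutations actions ]
```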
ADPfusion combines stream-fusion (using the stream interface provided by the vector library) and type-level programming to provide highly efficient dynamic programming combinators.
From the programmers' viewpoint, ADPfusion behaves very much like the original ADP implementation http://bibiserv.techfak.uni-bielefeld.de/adp/ developed by Robert Giegerich and colleagues, though both combinator semantics and backtracking are different.
The library internals, however, are designed not only to speed up ADP by a large margin (which this library does), but also to provide further runtime improvements by allowing the programmer to switch over to other kinds of data structures with better time and space behaviour. Most importantly, dynamic programming tables can be strict, removing indirections present in lazy, boxed tables.
As a simple benchmark, consider the Nussinov78 algorithm, which translates to three nested for loops in C. In the figure, four different approaches are compared on inputs from 100 to 1000 characters, in increments of 100 characters. C is an implementation in C (in the .C directory of the source) compiled with "gcc -O3". ADP is the original ADP approach (see link above), while GAPC uses the GAP language (http://gapc.eu/).
Performance comparison figure: http://www.tbi.univie.ac.at/~choener/adpfusion/gaplike-nussinov-runtime.jpg
Please note that actual performance will depend much on table layout and data structures accessed during calculations, but in general performance is very good: close to C and better than other high-level approaches (that I know of).
Even complex ADP code tends to be completely optimized to loops that use only unboxed variables (Int# and others, indexIntArray# and others).
Completely novel compared to ADP is the idea of allowing efficient monadic combinators. This facilitates writing code that performs backtracking or samples structures stochastically, among other things.
Two algorithms from the realm of computational biology are provided as examples on how to write dynamic programming algorithms using this library: http://hackage.haskell.org/package/Nussinov78 and http://hackage.haskell.org/package/RNAFold.
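For reference, the Nussinov recursion mentioned above (maximise the number of base pairs in an RNA sequence) can be written as a plain-Haskell dynamic program memoised with a lazy array. This is the textbook recurrence, not ADPfusion code, and is exactly the kind of program the library's combinators express and fuse:

```haskell
import Data.Array

-- Nussinov base-pair maximisation: N(i,j) is the best score for the
-- subsequence [i..j].  Either position i is unpaired, or it pairs
-- with some k, splitting the problem in two.
nussinov :: String -> Int
nussinov "" = 0
nussinov s  = t ! (0, n - 1)
  where
    n     = length s
    xs    = listArray (0, n - 1) s
    pairs = [('A','U'),('U','A'),('C','G'),('G','C'),('G','U'),('U','G')]
    canPair i j = (xs ! i, xs ! j) `elem` pairs
    -- lazy array: each cell refers back into t, so evaluation order
    -- is handled by laziness rather than explicit loop nesting
    t = array ((0, 0), (n - 1, n - 1))
          [ ((i, j), f i j) | i <- [0 .. n - 1], j <- [0 .. n - 1] ]
    f i j
      | i >= j    = 0
      | otherwise = maximum $
          t ! (i + 1, j) :                       -- i stays unpaired
          [ 1 + (if k > i + 1 then t ! (i + 1, k - 1) else 0)
              + (if k < j     then t ! (k + 1, j)     else 0)
          | k <- [i + 1 .. j], canPair i k ]     -- i pairs with k
```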
This package is deprecated; please use io-choice.
Annotations provides utility functions to make working with annotated trees easier. There are two implementations: one for working with open datatypes that explicitly make their child positions accessible through a type argument, and one for working with MultiRec datatypes.
Parser combinators make it easy to construct trees annotated with position information. For the MultiRec implementation, there is the Yield monad that allows construction of trees in postorder.
Error algebras allow destruction of trees using catamorphisms. The algebra is allowed to indicate failure in which case the error is automatically coupled with the annotation at the position at which the error occurred.
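The error-algebra idea can be sketched in a self-contained way (hypothetical types, not this package's API): each node carries an annotation — here a source position — and a fold whose algebra may fail pairs any failure with the annotation of the node where it occurred. Direct recursion stands in for the catamorphism machinery:

```haskell
-- An expression tree where every node carries an annotation.
data Expr pos
  = Num pos Int
  | Div pos (Expr pos) (Expr pos)

-- Evaluate; on failure, report the annotation of the offending node
-- alongside the error message.
eval :: Expr pos -> Either (pos, String) Int
eval (Num _ n)   = Right n
eval (Div p a b) = do
  x <- eval a
  y <- eval b
  if y == 0
    then Left (p, "division by zero")   -- failure coupled with position
    else Right (x `div` y)
```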