
Concurrent and Parallel Programming

Haskell has been designed for parallel and concurrent programming since its inception. In particular, Haskell's purity greatly simplifies reasoning about parallel programs. This page lists libraries and extensions for programming concurrent and parallel applications in Haskell. See also the research papers on parallel and concurrent Haskell.

Collected tutorials and information on multicore programming with Haskell:

SMP Haskell

Multiprocessor GHC
Since version 6.5, GHC has supported running programs in parallel on an SMP or multicore machine, and it has been used successfully on machines with up to 48 cores.

Concurrency

Concurrent Haskell
GHC has supported concurrency with lightweight threads for more than a decade, and it is very fast. Threads in Haskell are preemptively scheduled and support everything you would normally expect from threads, including blocking I/O and foreign calls. A small example follows below.
Wrapped Concurrency
A wrapper around Control.Concurrent and Control.Exception that provides versions of forkIO with more guarantees.
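
As an illustration, here is a minimal sketch using the basic Concurrent Haskell API (forkIO and MVar from Control.Concurrent); the delays and values are invented. Compile with ghc -threaded and run with +RTS -N to spread threads over multiple cores.

    import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar, threadDelay)
    import Control.Monad (forM_, replicateM)

    -- Fork one lightweight thread per task; each thread does some (simulated)
    -- blocking work and reports its result through a shared MVar.
    main :: IO ()
    main = do
      done <- newEmptyMVar
      forM_ [1 .. 5 :: Int] $ \n ->
        forkIO $ do
          threadDelay (n * 100000)   -- pretend to block, e.g. on I/O
          putMVar done (n * n)
      results <- replicateM 5 (takeMVar done)
      print results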

Concurrent channels

Control.Concurrent.Chan
Unbounded FIFO channels that allow threads to pass messages to each other.
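
A small sketch of message passing over a Chan; the producer thread and the message strings are invented for illustration.

    import Control.Concurrent (forkIO)
    import Control.Concurrent.Chan (newChan, readChan, writeChan)
    import Control.Monad (forM_, replicateM)

    -- A producer thread writes messages into a channel; the main thread
    -- reads them back in FIFO order.
    main :: IO ()
    main = do
      chan <- newChan
      _ <- forkIO (forM_ ["ping", "pong", "done"] (writeChan chan))
      msgs <- replicateM 3 (readChan chan)
      mapM_ putStrLn msgs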

Software transactional memory

Software Transactional Memory
GHC supports a sophisticated version of software transactional memory. Software Transactional Memory (STM) is a new way to coordinate concurrent threads.
Documentation: http://haskell.org/ghc/docs/latest/html/libraries/stm/Control-Concurrent-STM.html
The paper Composable memory transactions: http://research.microsoft.com/en-us/um/people/simonpj/papers/stm/stm.pdf
The paper Lock-free data structures using Software Transactional Memory in Haskell gives further examples of concurrent programming using STM: http://research.microsoft.com/en-us/um/people/simonpj/papers/stm/lock-free-flops06.pdf
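
A sketch of the STM API (atomically, TVar and retry from the stm package); the bank-account setting and amounts are invented:

    import Control.Concurrent (forkIO)
    import Control.Concurrent.STM (TVar, atomically, newTVarIO, readTVar, retry, writeTVar)
    import Control.Monad (replicateM_)

    -- Move money between two accounts.  The whole transfer commits atomically,
    -- and it blocks (retries) until the source account has enough funds.
    transfer :: TVar Int -> TVar Int -> Int -> IO ()
    transfer from to amount = atomically $ do
      balance <- readTVar from
      if balance < amount
        then retry                          -- block until 'from' changes
        else do
          writeTVar from (balance - amount)
          readTVar to >>= writeTVar to . (+ amount)

    main :: IO ()
    main = do
      a <- newTVarIO 100
      b <- newTVarIO 0
      replicateM_ 10 (forkIO (transfer a b 10))
      -- Wait until all ten transfers have committed.
      atomically $ readTVar b >>= \v -> if v < 100 then retry else return ()
      putStrLn "all transfers committed"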

Parallel Strategies

Strategies provide a high-level compositional API for parallel programming.
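
A brief sketch using the strategies API from the parallel package (using, parListChunk and rdeepseq); the expensive function is just a placeholder for real work. Compile with ghc -threaded -O2 and run with +RTS -N.

    import Control.Parallel.Strategies (parListChunk, rdeepseq, using)

    -- A placeholder for some expensive pure computation.
    expensive :: Int -> Int
    expensive n = sum [1 .. n * 10000]

    -- Evaluate the list in parallel, 10 elements per spark, to normal form.
    main :: IO ()
    main = do
      let results = map expensive [1 .. 200] `using` parListChunk 10 rdeepseq
      print (sum results)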

Low-level parallelism: par and pseq

The Control.Parallel module provides the low-level operations for parallelism on which Strategies are built.
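
For instance, the classic parallel nfib sketch with par and pseq (the input size is arbitrary): par sparks its first argument for possible parallel evaluation, while pseq forces evaluation order.

    import Control.Parallel (par, pseq)

    -- Spark the first recursive call in parallel with the second, then combine.
    nfib :: Int -> Int
    nfib n
      | n < 2     = 1
      | otherwise = x `par` (y `pseq` x + y + 1)
      where
        x = nfib (n - 1)
        y = nfib (n - 2)

    main :: IO ()
    main = print (nfib 30)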

Data Parallel Haskell

Data Parallel Haskell
Implicitly parallel, high-performance (nested) arrays, supporting large-scale multicore programming.

Communicating Haskell Processes

CHP: Communicating Haskell Processes
CHP is built on the ideas of CSP (Communicating Sequential Processes), featuring encapsulated parallel processes (no shared data!) communicating over synchronous channels. This is a very composable model that also allows choice over communications, so that a process may offer either to read on one channel or to write on another, but will take only whichever becomes available first.

Actors

Actors with multi-headed receive clauses
Actor-based concurrency for Haskell

Transactional events

Transactional events, based on Concurrent ML
Transactional events for Haskell.

Unified events and threads

User-level events and threads
Ultra-lightweight, user-level threads for GHC Haskell, layered over epoll. Supports up to 10 million lightweight threads. Experimental.

Feedback-directed implicit parallelism

Implicit parallelism in Haskell, and a feedback-directed mechanism to increase its granularity

Parallel Haskell

GpH: Glasgow Parallel Haskell
A complete, GHC-based implementation of the parallel Haskell extension GpH and of evaluation strategies is available. Extensions of the runtime-system and language to improve performance and support new platforms are under development.

Distributed Haskell

GdH: Glasgow Distributed Haskell
GdH supports distributed stateful interactions on multiple locations. It is a conservative extension of both Concurrent Haskell and GpH, enabling the distribution of the stateful IO threads of the former on the multiple locations of the latter. The programming model includes forking stateful threads on remote locations, explicit communication over channels, and distributed exception handling.
Mobile Haskell (mHaskell)
Mobile Haskell supports both strong and weak mobility of computations across open networks. The mobility primitives are higher-order polymorphic channels. Mid-level abstractions like remote evaluation, analogous to Java RMI, are readily constructed. High-level mobility skeletons like mobile map and mobile fold encapsulate common patterns of mobile computation.
Eden
Eden extends Haskell with a small set of syntactic constructs for explicit process specification and creation. While providing enough control to implement parallel algorithms efficiently, it frees the programmer from the tedious task of managing low-level details by introducing automatic communication (via head-strict lazy lists), synchronisation, and process handling.

MPI

hMPI
hMPI is an acronym for HaskellMPI. It is a Haskell binding conforming to the MPI (Message Passing Interface) standard 1.1/1.2. The programmer has full control over the communication between the nodes of a cluster.
Haskell-MPI
Haskell-MPI provides a Haskell interface to MPI, built on top of the foreign function interface. It is notionally a descendant of hMPI, but is mostly a rewrite.

Distributed Haskell: Ports

The Haskell Ports Library (HPL)
Ports are an abstraction for modelling variables whose values evolve over time, without the need to resort to mutable variables such as IORefs. More precisely, a port represents all the values that a time-dependent variable successively takes as a stream, where each element of the stream corresponds to a state change; in other words, a port represents a time series. Moreover, a port supports concurrent construction of the time series, or stream of values.

Modelling concurrent and distributed systems

HCPN: Haskell-Coloured Petri Nets
Haskell-Coloured Petri Nets (HCPN) are an instance of high-level Petri Nets, in which anonymous tokens are replaced by Haskell data objects (and transitions can operate on that data, in addition to moving it around). This gives us a hybrid graphical/textual modelling formalism for Haskell, especially suited for modelling concurrent and distributed systems.