ThreadScope Tour/SparkOverview

= ThreadScope and sparks =

ThreadScope 0.2.1 and higher come with spark event visualisations that help you to understand not just what behaviours your parallel program is exhibiting (e.g. not using all cores) but why.

It helps to know a bit about sparks:

(diagram: spark lifecycle)
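To make the lifecycle concrete, here is a minimal sketch of how sparks are created (it is not part of the original tour; <code>expensive</code> and the argument sizes are made-up placeholders), using <code>rpar</code> and <code>rseq</code> from <code>Control.Parallel.Strategies</code>:

<haskell>
import Control.Parallel.Strategies (runEval, rpar, rseq)

-- Some work worth sparking; purely illustrative.
expensive :: Integer -> Integer
expensive n = sum [1 .. n]

-- Each rpar creates a spark: a pointer to an unevaluated thunk, placed in
-- the spark pool.  The RTS may later convert it (run it on an idle core),
-- or it may fizzle (another thread evaluates the thunk first), be GC'd
-- (the thunk becomes garbage before it runs), or overflow (the pool is
-- already full when the spark is created).
pair :: (Integer, Integer)
pair = runEval $ do
  a <- rpar (expensive 20000000)  -- spark created here
  b <- rseq (expensive 30000000)  -- evaluated by the current thread
  _ <- rseq a                     -- wait for the spark's result
  return (a, b)

main :: IO ()
main = print pair
</haskell>

To see the resulting spark events in ThreadScope, build with GHC's <code>-threaded -eventlog</code> flags and run with <code>+RTS -N -ls</code>.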

= Spark viewer features =

Review spark creation and creation rates:

[[Image:ThreadScope-spark-creation-conversion.png|300px|spark creation/conversion]]

Track the size of the spark pool:

[[Image:ThreadScope-spark-pool.png|300px|spark pool]]

See the distribution of sparks grouped by their sizes:

[[Image:ThreadScope-spark-size-focus.png|300px|spark size histogram]]

= Things to look for =

The combination of these features can be used to look for some common problems, such as:

# Too few sparks (not enough parallelism)
#* spark pool hits empty
#* low spark creation rate
# Too many sparks
#* overflow (red) is wasted work
#* can cause catastrophic loss of parallelism
# Too many duds or fizzled sparks (grey)
# Too many sparks get GC'd (orange)
# Sparks too small (overheads too high; see the sketch after this list)
# Sparks too big (load balancing problems, e.g. sudoku2)
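As a rough illustration of granularity problems 5 and 6 (again not from the original tour; the <code>(*2)</code> workload and the chunk size of 1000 are arbitrary), compare sparking one tiny thunk per list element with sparking larger chunks:

<haskell>
import Control.Parallel.Strategies (parList, parListChunk, rdeepseq, using)

-- One spark per element: far too many, far too small sparks.  Most of
-- them overflow the pool or fizzle, and the bookkeeping outweighs the
-- tiny amount of work each one does.
tooFine :: [Int] -> [Int]
tooFine xs = map (* 2) xs `using` parList rdeepseq

-- One spark per chunk of 1000 elements: fewer, bigger sparks.  Making
-- the chunks very large instead recreates problem 6, a handful of huge
-- sparks that cannot be balanced across the available cores.
chunked :: [Int] -> [Int]
chunked xs = map (* 2) xs `using` parListChunk 1000 rdeepseq

main :: IO ()
main = print (sum (tooFine [1 .. 10000]) + sum (chunked [1 .. 1000000]))
</haskell>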

In the [[ThreadScope_Tour/Spark|following sections]] we will walk through some examples of attempts to diagnose these problems.