Colocation of Potential Parallelism in a Distributed Adaptive Run-Time System for Parallel Haskell

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


This paper presents a novel variant of work stealing for load balancing in a distributed graph reducer that executes a semi-explicit parallel dialect of Haskell. The key concept of this load balancer is to colocate related sparks (units of potential parallelism): spark selection decisions use maximum prefix matching on an encoding of each spark's ancestry within the computation tree, reconstructed at run time. We evaluate spark colocation in terms of performance and scalability on a set of five benchmarks on a Beowulf-class cluster of multi-core machines using up to 256 cores. Compared to the baseline mechanism, we achieve a speedup increase of up to 46% for three of the five applications, due to improved locality and load balance throughout the execution, as demonstrated by profiling data. For one less scalable program and one program with an excessive amount of very fine-grained parallelism we observe speedup drops of 17% and 42%, respectively. Overall, spark colocation reduces the mean time to fetch required data and yields a higher degree of parallelism of finer granularity, which is most beneficial at higher PE counts.
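The core selection idea can be illustrated with a minimal sketch (not the paper's implementation): assume each spark's ancestry is encoded as the path of child indices from the root of the computation tree, and a thief selects the spark whose ancestry shares the longest common prefix with a reference ancestry, e.g. that of work already colocated on the requesting PE. The names `Ancestry`, `prefixLen`, and `selectSpark` are hypothetical.

```haskell
import Data.List (maximumBy)
import Data.Ord  (comparing)

-- Hypothetical encoding: a spark's ancestry as the path of child
-- indices from the root of the computation tree.
type Ancestry = [Int]

-- Length of the common prefix of two ancestry encodings.
prefixLen :: Ancestry -> Ancestry -> Int
prefixLen xs ys = length (takeWhile id (zipWith (==) xs ys))

-- Pick the spark from the pool whose ancestry shares the longest
-- prefix with the reference ancestry; Nothing for an empty pool.
selectSpark :: Ancestry -> [(Ancestry, spark)] -> Maybe (Ancestry, spark)
selectSpark _   []   = Nothing
selectSpark ref pool = Just (maximumBy (comparing (prefixLen ref . fst)) pool)
```

A spark at path `[0,1,5]` would be preferred over one at `[2]` when the reference ancestry is `[0,1]`, since siblings and descendants of already-local work share longer prefixes and are therefore colocated.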

Original language: English
Title of host publication: Trends in Functional Programming
Subtitle of host publication: 19th International Symposium, TFP 2018, Gothenburg, Sweden, June 11–13, 2018, Revised Selected Papers
Editors: Michał Pałka, Magnus Myreen
Number of pages: 19
ISBN (Electronic): 978-3-030-18506-0
Publication status: E-pub ahead of print - 24 Apr 2019
Event: 19th International Symposium on Trends in Functional Programming 2018 - Chalmers University, Gothenburg, Sweden
Duration: 11 Jun 2018 – 13 Jun 2018

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743


Conference: 19th International Symposium on Trends in Functional Programming 2018
Abbreviated title: TFP 2018


Keywords
  • Adaptive parallelism
  • Distributed-memory work stealing
  • Graph reduction
  • Load balancing
  • Parallel functional programming

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)


