Understanding Priority-Based Scheduling of Graph Algorithms on a Shared-Memory Platform

Bibliographic Details
Published in: SC19: International Conference for High Performance Computing, Networking, Storage and Analysis, pp. 1-14
Main authors: Yesil, Serif; Heidarshenas, Azin; Morrison, Adam; Torrellas, Josep
Format: Conference paper
Language: English
Published: ACM, 17.11.2019
ISSN: 2167-4337
Online access: Full text
Description
Abstract: Many task-based graph algorithms benefit from executing tasks according to some programmer-specified priority order. To support such algorithms, graph frameworks use Concurrent Priority Schedulers (CPSs), which attempt, but do not guarantee, to execute the tasks according to their priority order. While CPSs are critical to performance, there is insufficient insight into the relative strengths and weaknesses of the different CPS designs in the literature. Such insights would be valuable for designing better CPSs for graph processing. This paper addresses this problem. It performs a detailed empirical performance analysis of several advanced CPS designs in a state-of-the-art graph analytics framework running on a large shared-memory server. Our analysis finds that all CPS designs but one impose major overheads that dominate running time. Only one CPS, the Galois system's obim, typically imposes negligible overheads. However, obim's performance is input-dependent and can degrade substantially for some inputs. Based on our insights, we develop PMOD, a new CPS that is robust and delivers the highest performance overall.
DOI: 10.1145/3295500.3356160