Message Passing Interface (MPI)

Bibliographic Details
Published in: Advanced Computer Architecture and Parallel Processing, pp. 205-233
Authors: El-Rewini, Hesham; Abd-El-Barr, Mostafa
Format: Book chapter
Language: English
Published: Hoboken, NJ, USA: John Wiley & Sons, Inc., 17 December 2004
Series: Wiley Series on Parallel and Distributed Computing
ISBN: 9780471467403, 0471467405
DOI: 10.1002/0471478385.ch9
Abstract: The goal of the Message Passing Interface (MPI) is to provide a standard library of routines for writing portable and efficient message-passing programs. MPI is not a language; it is a specification of a library of routines that can be called from programs. MPI provides a rich collection of point-to-point communication routines and collective operations for data movement, global computation, and synchronization. The MPI standard has evolved through the work on MPI-2, which extended MPI with additional features, including dynamic processes, client-server support, one-sided communication, parallel I/O, and non-blocking collective communication functions. In this chapter, we discuss a number of the important functions and programming techniques introduced so far. An MPI application can be visualized as a collection of concurrent communicating tasks. A program consists of code written by the application programmer, linked with a function library provided by the MPI software implementation. Each task is assigned a unique rank within a certain context: an integer between 0 and n-1 for an MPI application consisting of n tasks. These ranks are used by MPI tasks to identify each other when sending and receiving messages, to execute collective operations, and to cooperate in general. MPI tasks can run on the same processor or on different processors concurrently.
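
To make the rank and communication concepts in the abstract concrete, here is a minimal sketch in C. It is not taken from the chapter itself; it is a hypothetical example that assumes a standard MPI implementation such as MPICH or Open MPI. Each task queries its rank in MPI_COMM_WORLD, rank 0 sends an integer to rank 1 with a point-to-point call, and all tasks then join a collective reduction.

/* Minimal MPI sketch (illustrative, not from the chapter):
 * each task learns its rank, rank 0 sends a value to rank 1
 * (point-to-point), and all tasks join a collective sum. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);                  /* start the MPI runtime     */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* my rank: 0 .. n-1         */
    MPI_Comm_size(MPI_COMM_WORLD, &size);    /* n, the number of tasks    */

    /* Point-to-point: rank 0 sends an integer to rank 1. */
    if (size > 1) {
        int token = 42;
        if (rank == 0) {
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("rank 1 received %d from rank 0\n", token);
        }
    }

    /* Collective: global sum of all ranks, delivered to rank 0. */
    int sum = 0;
    MPI_Reduce(&rank, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("sum of ranks 0..%d = %d\n", size - 1, sum);

    MPI_Finalize();                          /* shut down cleanly         */
    return 0;
}

A typical way to build and launch such a program is "mpicc example.c -o example" followed by "mpirun -np 4 ./example", though compiler wrapper and launcher names vary between MPI implementations.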