Structure of Message-Passing Programs
- Message-passing programs are often written using the asynchronous or loosely synchronous paradigms.
- In the asynchronous paradigm, all concurrent tasks execute asynchronously, which makes it possible to implement any parallel algorithm. However, such programs are harder to reason about and can exhibit non-deterministic behavior due to race conditions.
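This non-determinism can be sketched with a small example (a minimal illustration using Python's `multiprocessing` module as a stand-in for a message-passing library; the names `async_task` and `run_async` are illustrative, not from the text):

```python
from multiprocessing import Process, Queue

def async_task(rank, queue):
    # Each task executes asynchronously; no ordering is imposed between tasks.
    queue.put(rank)

def run_async(nprocs=4):
    queue = Queue()
    procs = [Process(target=async_task, args=(r, queue)) for r in range(nprocs)]
    for p in procs:
        p.start()
    # Messages arrive in whatever order the OS happens to schedule the tasks:
    # the arrival order is non-deterministic, but the set of messages is not.
    arrived = [queue.get() for _ in range(nprocs)]
    for p in procs:
        p.join()
    return arrived

if __name__ == "__main__":
    print(sorted(run_async()))
```

Running this repeatedly may return the ranks in different orders, which is exactly the kind of scheduling-dependent behavior that makes fully asynchronous programs hard to reason about.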
- Loosely synchronous programs are a good compromise between fully synchronous and fully asynchronous execution. In such programs, tasks or subsets of tasks synchronize to perform interactions. However, between these interactions, tasks execute completely asynchronously. Since the interactions happen synchronously, it is still quite easy to reason about the program. Many of the known parallel algorithms can be naturally implemented using loosely synchronous programs.
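The phase structure described above can be sketched as follows (a hedged example using Python's `multiprocessing.Barrier` to mark the interaction points; the function names are illustrative assumptions, not from the text):

```python
from multiprocessing import Barrier, Process, Queue

def phase_task(rank, barrier, results):
    # Phase 1: purely local work, executed completely asynchronously.
    local = (rank + 1) ** 2
    barrier.wait()        # interaction point: tasks synchronize here...
    results.put(local)    # ...and only then exchange their data
    barrier.wait()        # all exchanges complete before the next phase

def run_phases(nprocs=4):
    barrier = Barrier(nprocs)
    results = Queue()
    procs = [Process(target=phase_task, args=(r, barrier, results))
             for r in range(nprocs)]
    for p in procs:
        p.start()
    total = sum(results.get() for _ in range(nprocs))
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(run_phases())
```

Because all communication is confined to the synchronized interaction points, the result is deterministic even though the tasks run asynchronously between those points.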
- In its most general form, the message-passing paradigm supports execution of a different program on each of the processes. This provides the ultimate flexibility in parallel programming, but makes the job of writing parallel programs effectively unscalable.
- For this reason, most message-passing programs are written using the single program multiple data (SPMD) approach. In SPMD programs, all processes execute identical code, except possibly for a small number of them (e.g., the "root" process).
- This does not mean that the processes work in lock-step. In an extreme case, even in an SPMD program, each process could execute entirely different code (the program would contain a large case statement with a branch for each process). But except for this degenerate case, most processes execute the same code. SPMD programs can be loosely synchronous or completely asynchronous.
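A minimal SPMD sketch, again using Python's `multiprocessing` in place of a real message-passing library (the function names and the choice of having rank 0 act as root are illustrative assumptions):

```python
from multiprocessing import Process, Queue

def spmd_task(rank, nprocs, partials, result):
    # Every process executes this same program (SPMD); behavior branches
    # only on the process's rank.
    partial = rank * 2
    if rank == 0:
        # Only the root's path differs: it gathers the partial results
        # that the other processes send.
        total = partial + sum(partials.get() for _ in range(nprocs - 1))
        result.put(total)
    else:
        partials.put(partial)

def run_spmd(nprocs=4):
    partials, result = Queue(), Queue()
    procs = [Process(target=spmd_task, args=(r, nprocs, partials, result))
             for r in range(nprocs)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return result.get()

if __name__ == "__main__":
    print(run_spmd())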
Cem Ozdogan
2006-12-27