Q. Define MPI. Explain in detail principles of Message Passing Programming. (MAY 19) 10M
Q. What are the principles of Message Passing Programming. (DEC 18, MAY 17) 10M
Q.What is Message Passing Interface? What are the principles of Message Passing Programming. (DEC 17) 10M
Ans- Message Passing Interface
Message Passing Interface (MPI) is a standardized and portable message-passing standard designed by a group of researchers from academia and industry to function on a wide variety of parallel computing architectures.
Principles of Message Passing Programming
- Two key attributes characterize the message-passing programming paradigm:
- First, it assumes a partitioned address space.
- Second, it supports only explicit parallelization.
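Both attributes can be illustrated with a small sketch. This is not MPI itself: it uses Python's standard `multiprocessing` module, where `Pipe` stands in for MPI's send/receive calls, and the helper names `worker` and `remote_sum` are my own. Each spawned process has its own partitioned address space, so the only way to observe its data is an explicit message.

```python
from multiprocessing import Process, Pipe

def worker(conn, chunk):
    # `chunk` lives in this process's private (partitioned) address space.
    # The ONLY way another process can see the result is an explicit send.
    conn.send(sum(chunk))
    conn.close()

def remote_sum(chunk):
    parent, child = Pipe()                      # explicit communication channel
    p = Process(target=worker, args=(child, chunk))
    p.start()                                   # explicit parallelization
    result = parent.recv()                      # explicit, blocking receive
    p.join()
    return result

if __name__ == "__main__":
    print(remote_sum([1, 2, 3]))  # -> 6
```

Note that nothing is shared implicitly: the parent cannot read the worker's variables, and the worker cannot publish its result except by sending it.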
- There are two immediate implications of a partitioned address space.
- First, each data element must belong to one of the partitions of the space; hence, data must be explicitly partitioned and placed. This adds complexity to programming, but it encourages locality of access, which is critical for achieving high performance on non-UMA architectures, since a processor can access its local data much faster than non-local data on such machines.
- The second implication is that all interactions (read-only or read/write) require cooperation of two processes (the process that has the data and the process that wants to access the data). This requirement for cooperation adds a great deal of complexity for a number of reasons.
- The process that owns the data must participate in the interaction even if it has no logical connection to the events at the requesting process. In certain circumstances this requirement leads to unnatural programs; in particular, for dynamic and/or unstructured interactions, the complexity of the code written in this paradigm can be very high.
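The two-sided nature of every interaction can be sketched as follows. This is an illustration, not MPI: Python's `multiprocessing.Pipe` plays the role of the message channel, and the names `owner` and `lookup` are hypothetical. The point is that the process holding the data must actively service every request, even though it performs no computation of its own.

```python
from multiprocessing import Process, Pipe

def owner(conn):
    # This process holds the data; it must cooperate in EVERY access,
    # even though it has no logical work of its own to do here.
    table = {"a": 10, "b": 20}
    while True:
        key = conn.recv()
        if key is None:          # agreed-upon shutdown message
            break
        conn.send(table[key])
    conn.close()

def lookup(conn, key):
    conn.send(key)               # requester's half of the interaction
    return conn.recv()           # owner's reply completes it

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=owner, args=(child,))
    p.start()
    print(lookup(parent, "b"))   # -> 20
    parent.send(None)            # release the owner from its service loop
    p.join()
```

If the owner were busy with other work, the requester would simply block: read access still requires the cooperation of both sides, which is exactly the source of complexity the text describes.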
- However, a primary advantage of explicit two-way interactions is that the programmer is fully aware of all the costs of non-local interactions, and is more likely to think about algorithms (and mappings) that minimize interactions.
- Another major advantage of this type of programming paradigm is that it can be efficiently implemented on a wide variety of architectures.
- The programmer is responsible for analyzing the underlying serial algorithm/application and identifying ways to decompose the computations and extract concurrency.
- As a result, programming using the message-passing paradigm tends to be hard and intellectually demanding. On the other hand, properly written message-passing programs can often achieve very high performance and scale to a very large number of processes.
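A minimal sketch of this explicit decomposition, again using Python's `multiprocessing` as a stand-in for MPI (the helper `decomposed_sum` and the striped partitioning are my own choices): the serial algorithm is simply `sum(data)`, and it is the programmer who splits the data into per-process chunks and writes the final combining step.

```python
from multiprocessing import Process, Pipe

def worker(conn, chunk):
    # Each worker owns only the chunk the programmer assigned to it.
    conn.send(sum(chunk))
    conn.close()

def decomposed_sum(data, nworkers=4):
    # The programmer's decomposition: stripe the data across workers,
    # then explicitly combine the partial results.
    chunks = [data[i::nworkers] for i in range(nworkers)]
    pipes, procs = [], []
    for chunk in chunks:
        parent, child = Pipe()
        p = Process(target=worker, args=(child, chunk))
        p.start()
        pipes.append(parent)
        procs.append(p)
    total = sum(conn.recv() for conn in pipes)   # explicit combine step
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(decomposed_sum(list(range(100))))  # -> 4950
```

Nothing here is automatic: the partitioning, the placement of each chunk, the sends, the receives, and the final reduction are all spelled out by the programmer, which is what makes the paradigm demanding but also transparent about its costs.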