
Derivation of Gustafson's Law

The execution time of the program on a parallel computer is decomposed into

$$T(P) = a + b,$$

where $a$ is the sequential time and $b$ is the parallel time, on any of the $P$ processors. (Overhead is ignored.)

The key assumption of Gustafson and Barsis is that the total amount of work to be done in parallel varies linearly with the number of processors. In practice, this means a single processor is far more capable than the single processing assignment it executes in parallel with (typically similar) assignments on the other processors. This implies that $b$, the per-process parallel time, should be held fixed as $P$ is varied. The corresponding time for sequential processing of the same scaled workload is

$$T_{\text{seq}} = a + P \cdot b.$$
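For a concrete illustration (the numbers here are assumed for the example, not taken from the original), suppose $a = 1$ second of sequential work and $b = 9$ seconds of parallel work per processor. The parallel run then takes $a + b = 10$ seconds regardless of $P$, while executing the same scaled workload on a single processor takes

$$a + P \cdot b = 1 + 9P \text{ seconds},$$

i.e. 91 seconds for $P = 10$ and 901 seconds for $P = 100$.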

Speedup is accordingly

$$S = \frac{a + P \cdot b}{a + b}.$$

Defining

$$\alpha = \frac{a}{a + b}$$

to be the sequential fraction of the parallel execution time, so that $1 - \alpha = b/(a + b)$, we have

$$S = \alpha + P\,(1 - \alpha) = P - \alpha\,(P - 1).$$

Thus, if $\alpha$ is small, the speedup is approximately $P$, as desired. It may even be the case that $\alpha$ diminishes as $P$ (together with the problem size) increases; if that holds, then $S$ approaches $P$ monotonically as $P$ grows.
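As a quick check of the algebra, here is a minimal sketch in Python (the function name gustafson_speedup and the sample values of $P$ and $\alpha$ are illustrative assumptions, not from the original) that evaluates the scaled speedup:

    def gustafson_speedup(p, alpha):
        """Scaled speedup S = alpha + p * (1 - alpha) = p - alpha * (p - 1)."""
        return p - alpha * (p - 1)

    # For small alpha the speedup stays close to P, in line with the derivation.
    for p in (8, 64, 1024):
        for alpha in (0.01, 0.05, 0.10):
            print(f"P={p:5d}  alpha={alpha:.2f}  S={gustafson_speedup(p, alpha):8.2f}")

For $P = 1024$ and $\alpha = 0.01$ this prints $S \approx 1013.77$, i.e. about 99% of the ideal speedup.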

Thus Gustafson's law seems to rescue parallel processing from Amdahl's law. It is based on the idea that if the problem size is allowed to grow monotonically with $P$, the sequential fraction of the workload will not ultimately come to dominate. This works because most of the assignments individually fit within a single processor's capacity: one processor can handle multiple assignments, but no single assignment spans more than one processor. The same rule governs the relationship between projects and work-sites: a site may host multiple projects, but each project belongs to only one site.
