Amdahl's Law

Amdahl's law, also known as Amdahl's argument, is named after computer architect Gene Amdahl, and is used to find the maximum expected improvement to an overall system when only part of the system is improved. It is often used in parallel computing to predict the theoretical maximum speedup using multiple processors. It was presented at the AFIPS Spring Joint Computer Conference in 1967.

The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program. For example, if a program needs 20 hours using a single processor core, and a particular portion of 1 hour cannot be parallelized, while the remaining portion of 19 hours (95%) can be parallelized, then regardless of how many processors are devoted to a parallelized execution of this program, the minimum execution time cannot be less than that critical 1 hour. Hence the theoretical speedup is limited to at most 20×.
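The worked example above follows from Amdahl's formula: with parallelizable fraction p and N processors, the speedup is S(N) = 1 / ((1 − p) + p / N), which approaches 1 / (1 − p) as N grows. A minimal sketch in Python (the function name and sample processor counts are illustrative choices, not from the source):

```python
def amdahl_speedup(parallel_fraction, processors):
    """Speedup predicted by Amdahl's law: 1 / ((1 - p) + p / N)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / processors)

# The 20-hour example: 95% of the work (19 hours) parallelizes.
for n in (1, 10, 100, 1000, 10**6):
    print(f"{n:>7} processors: {amdahl_speedup(0.95, n):.2f}x speedup")
# The speedup approaches, but never exceeds, 20x as N grows.
```

Even with a million processors, the 1-hour sequential portion caps the speedup just below 20×, matching the limit stated above.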

