Amdahl's Law

Amdahl's law, also known as Amdahl's argument, is named after computer architect Gene Amdahl. It is used to find the maximum expected improvement to an overall system when only part of the system is improved, and it is often used in parallel computing to predict the theoretical maximum speedup achievable with multiple processors. Amdahl presented the argument at the AFIPS Spring Joint Computer Conference in 1967.

The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program. For example, if a program needs 20 hours using a single processor core, and a particular portion of 1 hour cannot be parallelized, while the remaining 19 hours (95%) of execution time can be parallelized, then regardless of how many processors are devoted to a parallelized execution of this program, the minimum execution time cannot be less than that critical 1 hour. Hence the speedup is limited to at most 20×.
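
This bound can be computed directly from the standard form of the law, S(N) = 1 / ((1 − p) + p/N), where p is the parallelizable fraction of the work and N is the number of processors. The short Python sketch below (names are illustrative, not from the original text) reproduces the 20× limit for the example above:

    # Amdahl's law: speedup with n processors when a fraction p of the
    # work can be parallelized and the rest (1 - p) stays sequential.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    # Example from the text: 95% of a 20-hour job is parallelizable.
    p = 0.95
    for n in (1, 10, 100, 1000, 1_000_000):
        print(n, round(amdahl_speedup(p, n), 2))
    # As n grows, the speedup approaches 1 / (1 - p) = 20.

Even with a million processors, the sequential hour dominates, so the speedup never reaches 20×, it only approaches it.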

