Amdahl's Law

Amdahl's law, also known as Amdahl's argument, is named after computer architect Gene Amdahl, and is used to find the maximum expected improvement to an overall system when only part of the system is improved. It is often used in parallel computing to predict the theoretical maximum speedup using multiple processors. It was presented at the AFIPS Spring Joint Computer Conference in 1967.

The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program. For example, suppose a program needs 20 hours on a single processor core, and a particular 1-hour portion cannot be parallelized, while the remaining 19 hours (95%) of execution time can be parallelized. Then regardless of how many processors are devoted to the parallelized execution of the program, the minimum execution time cannot be less than that critical 1 hour. Hence the speedup is limited to at most 20×.
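The limit in the example follows from Amdahl's formula: if a fraction p of the program can be parallelized across n processors, the overall speedup is S(n) = 1 / ((1 − p) + p/n), which approaches 1/(1 − p) as n grows. A minimal sketch in Python (the function name is illustrative, not from the original source):

```python
def amdahl_speedup(p, n):
    """Theoretical speedup with parallel fraction p on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# For the 20-hour example: p = 0.95, so the speedup can never exceed
# 1 / (1 - 0.95) = 20, no matter how many processors are used.
for n in (1, 10, 100, 1000, 10**6):
    print(f"{n:>8} processors: {amdahl_speedup(0.95, n):6.2f}x")
```

Note how the returns diminish quickly: 100 processors already yield roughly 17× of the 20× ceiling, and a million processors gain almost nothing further.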

