Explicit Parallelism

In computer programming, explicit parallelism is the representation of concurrent computations by means of primitives in the form of special-purpose directives or function calls. Most parallel primitives are related to process synchronization, communication, or task partitioning. Because these primitives seldom contribute directly to carrying out the intended computation of the program, their computational cost is usually counted as parallelization overhead.
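As an illustration, the following minimal sketch in C uses POSIX threads to sum an array in parallel. The thread count, the chunk layout, and names such as sum_chunk are chosen here only for the example; the calls to pthread_create and pthread_join express the task partitioning and synchronization explicitly, while only the loop inside sum_chunk performs the intended computation, so everything else is parallelization overhead.

    /* A minimal sketch of explicit parallelism using POSIX threads.
       The pthread_create/pthread_join calls partition the work and
       synchronize the threads; they do no summing themselves. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    #define THREADS 4

    static double data[N];

    struct chunk { int start, end; double partial; };

    static void *sum_chunk(void *arg) {
        struct chunk *c = arg;
        c->partial = 0.0;
        for (int i = c->start; i < c->end; i++)
            c->partial += data[i];          /* the actual computation */
        return NULL;
    }

    int main(void) {
        pthread_t tid[THREADS];
        struct chunk chunks[THREADS];

        for (int i = 0; i < N; i++) data[i] = 1.0;

        /* Task partitioning: each thread gets an explicit slice of the array. */
        for (int t = 0; t < THREADS; t++) {
            chunks[t].start = t * (N / THREADS);
            chunks[t].end   = (t == THREADS - 1) ? N : (t + 1) * (N / THREADS);
            pthread_create(&tid[t], NULL, sum_chunk, &chunks[t]);
        }

        /* Synchronization: wait for every worker before combining results. */
        double total = 0.0;
        for (int t = 0; t < THREADS; t++) {
            pthread_join(tid[t], NULL);
            total += chunks[t].partial;
        }

        printf("sum = %f\n", total);
        return 0;
    }

Compiled with a pthreads-aware toolchain (for example, gcc -pthread), the program divides the array into four explicit slices and merges the partial sums after all threads have been joined.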

The advantage of explicit parallel programming is the absolute control it gives the programmer over parallel execution. A skilled parallel programmer can exploit explicit parallelism to produce very efficient code. However, programming with explicit parallelism is often difficult, especially for programmers who are not computing specialists, because of the extra work involved in planning the division of tasks and the synchronization of concurrent processes.

In some instances, explicit parallelism may be avoided by using an optimizing compiler that automatically extracts the parallelism inherent in the computation (see implicit parallelism).
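By contrast, an auto-parallelizing compiler works from ordinary sequential source that contains no parallel primitives at all. A sketch of the same reduction written sequentially is shown below; assuming a compiler with automatic loop parallelization (for example, GCC invoked with its -ftree-parallelize-loops option), the iterations may be distributed across threads without any change to the source.

    /* The same reduction written sequentially: no parallel primitives
       appear in the source. A compiler with automatic loop
       parallelization may split the loop across threads on its own;
       this is implicit parallelism. */
    #include <stdio.h>

    #define N 1000000

    static double data[N];

    int main(void) {
        for (int i = 0; i < N; i++) data[i] = 1.0;

        double total = 0.0;
        for (int i = 0; i < N; i++)
            total += data[i];           /* candidate for auto-parallelization */

        printf("sum = %f\n", total);
        return 0;
    }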
