Task Parallelism - Description

In a multiprocessor system, task parallelism is achieved when each processor executes a different thread (or process) on the same or different data. The threads may execute the same or different code. In the general case, different execution threads communicate with one another as they work. Communication usually takes place by passing data from one thread to the next as part of a workflow.

As a simple example, suppose we are running code on a 2-processor system (CPUs "a" and "b") in a parallel environment and we wish to perform tasks "A" and "B". We can tell CPU "a" to do task "A" and CPU "b" to do task "B" simultaneously, thereby reducing the overall run time. The tasks can be assigned using conditional statements, as in the sketch below.
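The following is a minimal sketch of this idea in C using POSIX threads. The functions do_task_A and do_task_B, the worker routine, and the integer IDs are hypothetical stand-ins for tasks "A" and "B"; the point is only that each thread of execution selects its work with a conditional on its own identity.

#include <stdio.h>
#include <pthread.h>

/* Hypothetical tasks "A" and "B"; the work they do is illustrative only. */
static void do_task_A(void) { printf("Task A running\n"); }
static void do_task_B(void) { printf("Task B running\n"); }

/* Each worker picks its task with a conditional on its ID,
   mirroring "if CPU is a, do task A; if CPU is b, do task B". */
static void *worker(void *arg) {
    int id = *(int *)arg;
    if (id == 0)
        do_task_A();
    else if (id == 1)
        do_task_B();
    return NULL;
}

int main(void) {
    pthread_t threads[2];
    int ids[2] = {0, 1};

    /* Launch both tasks concurrently, then wait for them to finish. */
    for (int i = 0; i < 2; i++)
        pthread_create(&threads[i], NULL, worker, &ids[i]);
    for (int i = 0; i < 2; i++)
        pthread_join(threads[i], NULL);
    return 0;
}

Compiled with a pthread-enabled compiler (for example, cc -pthread), the two tasks may run on different processors at the same time, which is the essence of task parallelism.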

Task parallelism emphasizes the distributed (parallelized) nature of the processing (i.e. threads), as opposed to the data (data parallelism). Most real programs fall somewhere on a continuum between task parallelism and data parallelism.
