Task Parallelism - Description

Description

In a multiprocessor system, task parallelism is achieved when each processor executes a different thread (or process) on the same or different data. The threads may execute the same or different code. In the general case, different execution threads communicate with one another as they work. Communication usually takes place by passing data from one thread to the next as part of a workflow.
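As a rough illustration of such a workflow (not part of the original text, just a sketch under assumed POSIX semantics), the C program below runs two cooperating tasks as separate processes: the parent produces a stream of values and the child consumes them through a pipe. The task bodies and values are purely illustrative.

#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    int fd[2];
    if (pipe(fd) == -1) { perror("pipe"); return 1; }

    pid_t pid = fork();
    if (pid == -1) { perror("fork"); return 1; }

    if (pid == 0) {                     /* child: downstream task in the workflow */
        close(fd[1]);                   /* child only reads */
        int value;
        while (read(fd[0], &value, sizeof value) == (ssize_t)sizeof value)
            printf("consumer received %d\n", value);
        close(fd[0]);
        return 0;
    }

    /* parent: upstream task in the workflow */
    close(fd[0]);                       /* parent only writes */
    for (int i = 0; i < 5; i++) {
        if (write(fd[1], &i, sizeof i) != (ssize_t)sizeof i)
            perror("write");
    }
    close(fd[1]);                       /* end-of-stream: read() in the child returns 0 */
    wait(NULL);
    return 0;
}

Here the two processes execute different code and communicate only by passing data downstream, which is the "workflow" style of communication described above.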

As a simple example, suppose we are running code on a 2-processor system (CPUs "a" and "b") in a parallel environment and we wish to perform tasks "A" and "B". We can tell CPU "a" to do task "A" and CPU "b" to do task "B" simultaneously, thereby reducing the run time of the execution. The tasks can be assigned using conditional statements, as sketched below.
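The following minimal sketch shows one way to express that conditional assignment, assuming an OpenMP compiler (compile with, e.g., gcc -fopenmp); task_a() and task_b() are hypothetical placeholders for the two tasks.

#include <stdio.h>
#include <omp.h>

/* placeholder bodies for task "A" and task "B" */
static void task_a(void) { printf("task A on thread %d\n", omp_get_thread_num()); }
static void task_b(void) { printf("task B on thread %d\n", omp_get_thread_num()); }

int main(void) {
    #pragma omp parallel num_threads(2)   /* two threads, ideally on two CPUs */
    {
        if (omp_get_thread_num() == 0)
            task_a();                     /* "CPU a" does task A */
        else
            task_b();                     /* "CPU b" does task B */
    }
    return 0;
}

Both branches run at the same time in different threads, so the total run time is roughly that of the longer task rather than the sum of the two.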

Task parallelism emphasizes the distributed (parallelized) nature of the processing (i.e., threads), as opposed to the data (data parallelism). Most real programs fall somewhere on a continuum between task parallelism and data parallelism.

