Concurrent Computing

Concurrent computing is a form of computing in which programs are designed as collections of interacting computational processes that may be executed in parallel. Concurrent programs (processes or threads) can be executed on a single processor by interleaving the execution steps of each via time-slicing, or in parallel by assigning each computational process to one of a set of processors, which may be close together or distributed across a network. The main challenges in designing concurrent programs are ensuring the correct sequencing of the interactions or communications between the different computations, and coordinating access to resources that are shared among them. Concurrent programs can be implemented in a number of ways, for example by running each computation as a separate operating system process, or by running the computations as a set of threads within a single operating system process.
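To make the coordination problem concrete, the following is a minimal sketch in Python (chosen here purely for illustration; the function names, thread count, and iteration counts are arbitrary). Several threads within one operating system process all increment a shared counter, and a lock serializes access so that interleaved read-modify-write steps do not lose updates.

    import threading

    # A counter shared by all threads in one process. Without coordination,
    # interleaved read-modify-write steps on it can lose updates.
    counter = 0
    lock = threading.Lock()  # mutual exclusion guards the shared resource

    def worker(increments):
        global counter
        for _ in range(increments):
            with lock:       # only one thread may update the counter at a time
                counter += 1

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()             # wait for every thread to finish

    print(counter)           # 400000 with the lock; typically less without it

Removing the lock illustrates the problem the paragraph above describes: the increment is not a single atomic step, so two threads can read the same old value and one update is silently overwritten.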

Pioneers in the field of concurrent computing include Edsger Dijkstra, Per Brinch Hansen, and C.A.R. Hoare.

