Parallel and Distributed Computing

Distributed systems are groups of networked computers that share a common goal for their work. The terms "concurrent computing", "parallel computing", and "distributed computing" overlap considerably, and no clear distinction exists between them. The same system may be characterised as both "parallel" and "distributed"; the processors in a typical distributed system run concurrently in parallel. Parallel computing may be seen as a particularly tightly coupled form of distributed computing, and distributed computing as a loosely coupled form of parallel computing. Nevertheless, it is possible to roughly classify concurrent systems as "parallel" or "distributed" using the following criteria:

  • In parallel computing, all processors may have access to a shared memory through which they exchange information.
  • In distributed computing, each processor has its own private memory (distributed memory), and information is exchanged by passing messages between the processors (a minimal sketch contrasting the two models follows this list).

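To make the contrast concrete, here is a minimal sketch in Go; the function names, data, and overall structure are illustrative assumptions, not part of the original article. In the shared-memory version, workers add their partial results directly into one shared variable guarded by a mutex; in the message-passing version, each worker keeps its data private and reports its partial sum as a message over a channel.

    // A minimal sketch contrasting the two models described above.
    // All names and data here are illustrative only.
    package main

    import (
        "fmt"
        "sync"
    )

    // sharedMemorySum: workers add their partial results directly into a
    // shared variable, coordinating through a mutex (shared-memory model).
    func sharedMemorySum(parts [][]int) int {
        var (
            total int
            mu    sync.Mutex
            wg    sync.WaitGroup
        )
        for _, part := range parts {
            wg.Add(1)
            go func(nums []int) {
                defer wg.Done()
                sum := 0
                for _, n := range nums {
                    sum += n
                }
                mu.Lock() // all workers can reach the same memory location
                total += sum
                mu.Unlock()
            }(part)
        }
        wg.Wait()
        return total
    }

    // messagePassingSum: each worker sees only its own data and reports its
    // partial sum as a message; the receiver combines the messages
    // (distributed, message-passing model).
    func messagePassingSum(parts [][]int) int {
        results := make(chan int, len(parts))
        for _, part := range parts {
            go func(nums []int) {
                sum := 0
                for _, n := range nums {
                    sum += n
                }
                results <- sum // information is exchanged only via messages
            }(part)
        }
        total := 0
        for range parts {
            total += <-results
        }
        return total
    }

    func main() {
        parts := [][]int{{1, 2, 3}, {4, 5, 6}, {7, 8, 9}}
        fmt.Println(sharedMemorySum(parts))   // 45
        fmt.Println(messagePassingSum(parts)) // 45
    }

Both functions compute the same total; the only difference is how the workers exchange information, which is exactly the criterion used above to separate "parallel" from "distributed" systems.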
The figure on the right illustrates the difference between distributed and parallel systems. Figure (a) is a schematic view of a typical distributed system; as usual, the system is represented as a network topology in which each node is a computer and each line connecting the nodes is a communication link. Figure (b) shows the same distributed system in more detail: each computer has its own local memory, and information can be exchanged only by passing messages from one node to another over the available communication links. Figure (c) shows a parallel system in which each processor has direct access to a shared memory.

The situation is further complicated by the traditional uses of the terms parallel and distributed algorithm, which do not quite match the above definitions of parallel and distributed systems; see the section Theoretical foundations below for a more detailed discussion. Nevertheless, as a rule of thumb, high-performance parallel computation in a shared-memory multiprocessor uses parallel algorithms, while the coordination of a large-scale distributed system uses distributed algorithms.
