Parallel Processing - Some History


Parallel Processing - the concurrent or simultaneous execution of two or more parts of a single computer program, at speeds far exceeding those of a conventional computer. Parallel processing requires two or more interconnected processors, each of which executes a portion of the task; some supercomputer parallel-processing systems have hundreds of thousands of microprocessors. The processors access data through shared memory. The efficiency of parallel processing is dependent upon the development of programming languages that optimize the division of the tasks among the processors.
 
The Columbia Electronic Encyclopedia, Sixth Edition
Columbia University Press.
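
The definition above turns on one idea: divide a single task among processors and combine the results. As a rough, illustrative sketch of that division of work (not taken from the encyclopedia entry or any particular system), the following Python program splits a large summation across worker processes. The worker count, problem size, and use of the multiprocessing module are all arbitrary choices, and these workers exchange results by message passing rather than the shared memory the definition mentions; the dividing-of-work idea is the same.

    # Illustrative only: split one computation (a big sum) into chunks,
    # run each chunk on its own process, then combine the partial results.
    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))

    def parallel_sum(n, workers=4):
        step = n // workers
        # Split [0, n) into `workers` roughly equal chunks.
        chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
                  for i in range(workers)]
        with Pool(workers) as pool:
            # Each chunk is summed by a separate worker process.
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        n = 1_000_000
        assert parallel_sum(n) == sum(range(n))   # same answer, work divided
        print(parallel_sum(n))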

Examples of Time-Dependent Computations

  1. Scientific Computations:
       If a scientific computation takes 10 hours to complete, a scientist might only be able to do one experiment per day. If it can be done in an hour, it may be possible to do several in a day.

  2. Weather Forecasting:
       If a forecast takes too long to compute, it may be of no value.

  3. Web Clients:
      Web applications should be user-friendly: each user should feel that they have exclusive access to a Web resource. The challenge is that typing takes place in human time, which is extremely slow by machine standards, so while one user types, the server can be serving many other users concurrently (see the sketch after this list).

  4. Algorithmic Stock Trading:
       If a trade decision takes 1 microsecond to make and another firm can make it in half a microsecond, then you might lose the opportunity.
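
For example 3, here is a minimal sketch of how a server can give each user the feel of exclusive access: a thread-per-client echo server in Python. The host, port, and line-oriented echo protocol are illustrative choices, not from the source. While one client's thread blocks waiting for slow, human-speed typing, the other threads keep serving everyone else.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 9000   # illustrative address; any free port works

    def handle_client(conn, addr):
        # Serve one client; runs in its own thread.
        with conn:
            while True:
                data = conn.recv(1024)   # blocks in human time while the user types
                if not data:             # empty read: the client hung up
                    break
                conn.sendall(data)       # echo the input straight back

    def main():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
            server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            server.bind((HOST, PORT))
            server.listen()
            while True:
                conn, addr = server.accept()
                # One thread per client: a slow typist never stalls the others.
                threading.Thread(target=handle_client, args=(conn, addr),
                                 daemon=True).start()

    if __name__ == "__main__":
        main()

The thread-per-client design is the simplest way to get this illusion of exclusive access; production servers often use thread pools or event loops instead, but the concurrency idea is the same.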