
Monday, 8 June 2015

Multiprocessor Overview



Multiprocessors

Definition:
A multiprocessor is an interconnection of two or more CPUs with memory and input-output (I/O) equipment. The term ‘processor’ in multiprocessor can mean either a CPU or an input-output processor (IOP). However, a system with a single CPU and one or more IOPs is usually not included in the definition of a multiprocessor system unless the IOP has computational facilities comparable to a CPU. As most commonly defined, a multiprocessor system implies the existence of multiple CPUs, although there will usually be one or more IOPs as well.


Figure: Bus-based multiprocessors

Flynn’s Classification of multiple-processor machines:
{SI, MI} x {SD, MD} = {SISD, SIMD, MISD, MIMD}
SISD = Single Instruction, Single Data
       Classical von Neumann machines.
SIMD = Single Instruction, Multiple Data
       Also called array processors or data-parallel machines.
MISD = Multiple Instruction, Single Data
       Does not exist in practice.
MIMD = Multiple Instruction, Multiple Data
       Control parallelism.
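
To make Flynn's categories concrete, here is a minimal, hypothetical C++ sketch (not part of the original material; the thread counts and data sizes are arbitrary). The first half applies the same operation to different slices of an array, the data-parallel style that SIMD machines exploit (a real SIMD machine would do this in hardware with vector lanes rather than with threads); the second half runs two different instruction streams at once, the control parallelism of MIMD machines.

// Hypothetical sketch: data parallelism (SIMD-style) versus
// control parallelism (MIMD-style) using standard C++ threads.
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(8);
    std::iota(data.begin(), data.end(), 1);               // data = 1, 2, ..., 8

    // SIMD-style data parallelism: every worker executes the SAME
    // operation (doubling) on a different slice of the data.
    std::vector<std::thread> workers;
    for (int t = 0; t < 2; ++t) {
        workers.emplace_back([&data, t] {
            for (int i = t * 4; i < (t + 1) * 4; ++i)
                data[i] *= 2;
        });
    }
    for (auto& w : workers) w.join();

    // MIMD-style control parallelism: the two threads run DIFFERENT
    // instruction streams (one sums, the other searches for the maximum).
    long sum = 0;
    int max_val = 0;
    std::thread summer([&] { sum = std::accumulate(data.begin(), data.end(), 0L); });
    std::thread maxer([&] {
        for (int x : data)
            if (x > max_val) max_val = x;
    });
    summer.join();
    maxer.join();

    std::cout << "sum = " << sum << ", max = " << max_val << '\n';  // sum = 72, max = 16
}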

A Taxonomy of Parallel Computers

NUMA        Non-Uniform Memory Access
COMA        Cache-Only Memory Access
MPP         Massively Parallel Processor
COW         Cluster of Workstations
CC-NUMA     Cache-Coherent NUMA
NC-NUMA     No-Cache NUMA
Characteristics of Multiprocessors

- A multiprocessor system is MIMD: an interconnection of two or more CPUs with memory and I/O equipment.
  - A system with a single CPU and one or more IOPs is usually not counted as a multiprocessor unless the IOP has computational facilities comparable to a CPU.
- Computation can proceed in parallel in one of two ways:
  1) Multiple independent jobs can be made to operate in parallel.
  2) A single job can be partitioned into multiple parallel tasks.
- Multiprocessors are classified by their memory organization:
  1) Shared-memory, or tightly coupled, system: local memory plus shared memory; suited to a higher degree of interaction between tasks.
  2) Distributed-memory, or loosely coupled, system: local memory plus a message-passing scheme (packets or messages); most efficient when the interaction between tasks is minimal.
  A minimal sketch contrasting these two organizations follows the list.
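
The practical difference between the two organizations is how tasks interact, and the following hypothetical C++ sketch (illustrative only, not part of the original material) mimics both on a single machine. In the tightly coupled case two threads update a variable that lives in shared memory, so every access must be synchronized with a mutex; in the loosely coupled case the threads interact only through a mailbox queue that stands in for the message-passing network (a real loosely coupled system would exchange packets or messages over an interconnect, for example with MPI or sockets).

// Hypothetical sketch: shared-memory (tightly coupled) interaction
// versus message-passing (loosely coupled) interaction, emulated
// on one machine with standard C++ threads.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

int main() {
    // 1) Tightly coupled: tasks communicate through shared memory,
    //    so concurrent access must be synchronized (here with a mutex).
    int shared_counter = 0;
    std::mutex counter_mutex;
    auto add = [&](int n) {
        for (int i = 0; i < n; ++i) {
            std::lock_guard<std::mutex> lock(counter_mutex);
            ++shared_counter;
        }
    };
    std::thread a(add, 1000), b(add, 1000);
    a.join();
    b.join();
    std::cout << "shared counter = " << shared_counter << '\n';   // 2000

    // 2) Loosely coupled: tasks communicate only by exchanging messages;
    //    the queue below plays the role of the interconnection network.
    std::queue<int> mailbox;
    std::mutex mailbox_mutex;
    std::condition_variable mail_arrived;
    std::thread producer([&] {
        for (int msg = 1; msg <= 3; ++msg) {
            { std::lock_guard<std::mutex> lock(mailbox_mutex); mailbox.push(msg); }
            mail_arrived.notify_one();
        }
    });
    std::thread consumer([&] {
        for (int received = 0; received < 3; ++received) {
            std::unique_lock<std::mutex> lock(mailbox_mutex);
            mail_arrived.wait(lock, [&] { return !mailbox.empty(); });
            std::cout << "received message " << mailbox.front() << '\n';
            mailbox.pop();
        }
    });
    producer.join();
    consumer.join();
}

Note how the shared-memory version pays a synchronization cost on every access, which is why tightly coupled systems suit tasks with a high degree of interaction, while the message-passing version only interacts when a message is actually sent, matching the point above that loose coupling is most efficient when interaction between tasks is minimal.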