Parallel and Distributed Computing Flashcards
Distributed Computing System example
The CS lab, supercomputers, email, file servers, printer access
Parallel Computing system example
Stampede here on campus
Distributed System
A set of physically separate processors connected by one or more communication links
Parallel Computing
- Tightly-coupled systems
- Processors share a clock and memory, and run one OS
- Frequent communication
- Processors connected via a network (typically a bus)
- Network is connected to a single shared memory
Distributed Computing
- Loosely-coupled systems
- Each processor has its own memory
- Each processor runs an independent OS
- Communication is very expensive
- Collection of computers (nodes) connected by a network
Parallel computing communicates through:
Shared memory (typically)
- Read and write accesses to shared memory locations
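A minimal sketch of shared-memory communication: several threads read and write one shared location, with a lock providing the synchronization that the shared-memory model requires. (The use of Python threads here is illustrative, not from the cards.)

```python
import threading

counter = 0               # the shared memory location
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:        # synchronize read-modify-write on shared memory
            counter += 1

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000: all 4 workers' updates are visible in the shared location
```

Without the lock, the read-modify-write of `counter += 1` could interleave between threads and lose updates.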
Two forms of parallel computing architecture:
- SMP (Symmetric Multiprocessor)
- Multicore
- Can also combine the two (e.g., Stampede)
Distributed computing communicates through:
message passing
The architecture of distributed computing
- Nodes are connected by a network
- Massively Parallel Machines
SMP (Symmetric Multiprocessor)
- Multiprocessor: two or more processors share a common RAM
- "Symmetric" refers to the OS (one OS for all processors, and any processor can run it)
Multicore:
Multiple processors (cores) on the same chip
Massively Parallel Machines
- Nodes are greatly simplified and include only memory, processor(s), and a network card
- Augmented with a fast network interface
Clusters
- Networked workstations connected by a fast network
- Built of common off-the-shelf (COTS) parts
- Less specialized
Very Distributed Computing
- Grid computing
- Cloud computing
Parallel programming involves:
- Decomposing an algorithm into parts
- Distributing the parts as tasks which are worked on by multiple processors simultaneously
- Coordinating work and communication of those processors (synchronization)