MODULE 1 Flashcards
Types of parallel computing:
- Bit-level parallelism
- Instruction-level parallelism
- Task parallelism
Bit-level parallelism:
Parallelism that comes from increasing the processor word size; the number of instructions a task needs depends on how many bits the processor can operate on at once
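A minimal sketch of the idea in C (the byte-wise helper add_bytewise and the specific values are illustrative, not from the cards): a 64-bit word machine finishes in one add what a hypothetical 8-bit word machine would split into eight carry-chained steps.

```c
/* Bit-level parallelism sketch: one 64-bit add vs. eight simulated
   8-bit adds with carry. The byte-wise loop only illustrates the
   extra work a narrow-word processor would have to do. */
#include <stdint.h>
#include <stdio.h>

/* add two 64-bit values one byte at a time, as a narrow-word CPU would */
static uint64_t add_bytewise(uint64_t a, uint64_t b) {
    uint64_t result = 0;
    unsigned carry = 0;
    for (int i = 0; i < 8; i++) {
        unsigned sum = ((a >> (8 * i)) & 0xFF) + ((b >> (8 * i)) & 0xFF) + carry;
        result |= (uint64_t)(sum & 0xFF) << (8 * i);
        carry = sum >> 8;
    }
    return result;
}

int main(void) {
    uint64_t a = 123456789012345ULL, b = 987654321098765ULL;
    printf("%llu\n", (unsigned long long)(a + b));            /* one 64-bit instruction */
    printf("%llu\n", (unsigned long long)add_bytewise(a, b)); /* eight 8-bit steps */
    return 0;
}
```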
Instruction-level parallelism:
How many instructions are executed per clock cycle
Task parallelism:
A task is decomposed into subtasks that can run concurrently
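A minimal sketch of task parallelism using POSIX threads (compile with -pthread; the subtasks sum_task and max_task are illustrative, not from the cards): two different subtasks of one job run at the same time.

```c
/* Task parallelism sketch: the job "summarize the array" is decomposed
   into two independent subtasks (sum and maximum) run on separate threads. */
#include <pthread.h>
#include <stdio.h>

static int data[8] = {4, 1, 7, 3, 9, 2, 8, 5};

static void *sum_task(void *arg) {
    static long sum = 0;
    for (int i = 0; i < 8; i++) sum += data[i];
    return &sum;                                 /* subtask 1: compute the sum */
}

static void *max_task(void *arg) {
    static int max = 0;
    for (int i = 0; i < 8; i++) if (data[i] > max) max = data[i];
    return &max;                                 /* subtask 2: find the maximum */
}

int main(void) {
    pthread_t t1, t2;
    void *sum, *max;
    pthread_create(&t1, NULL, sum_task, NULL);   /* decompose the job ...      */
    pthread_create(&t2, NULL, max_task, NULL);   /* ... into two subtasks      */
    pthread_join(t1, &sum);
    pthread_join(t2, &max);
    printf("sum=%ld max=%d\n", *(long *)sum, *(int *)max);
    return 0;
}
```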
Distributed systems issues:
- Lack of global knowledge
- Naming
- Scalability
- Compatibility
- Reliability/Fault tolerance
Applications of parallel computing:
- Databases
- Real-time simulation
- Multimedia
- Engineering
- Collaboration
- Augmented reality
Lack of global knowledge issues:
- Process synchronization
- Resource management
Dichotomy of parallel computing platforms:
An explicitly parallel program must specify its control structure and its communication model
Distributed system problems:
- Distributed consensus
- Caching consistency
- Trust
PRAM subclasses:
- Exclusive-Read, Exclusive-Write (EREW)
- Concurrent-Read, Exclusive-Write (CREW)
- Exclusive-Read, Concurrent-Write (ERCW)
- Concurrent-Read, Concurrent-Write (CRCW)
SIMD:
A single control unit dispatches the same instruction to multiple processing units, each working on different data
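A minimal sketch of the SIMD idea, assuming an OpenMP-capable compiler (compile with -fopenmp; without it the pragma is simply ignored). The same add operation is applied across many array elements, so the loop can be mapped onto vector (SIMD) hardware. The array names are illustrative.

```c
/* SIMD sketch: one operation, many data elements. */
#include <stdio.h>

int main(void) {
    float a[8], b[8], c[8];
    for (int i = 0; i < 8; i++) { a[i] = i; b[i] = 2.0f * i; }

    /* the single instruction (an add) is applied to different data lanes */
    #pragma omp simd
    for (int i = 0; i < 8; i++)
        c[i] = a[i] + b[i];

    for (int i = 0; i < 8; i++) printf("%g ", c[i]);
    printf("\n");
    return 0;
}
```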
Distributed memory parallel system:
Parallel systems in which processors do not share memory; each processor has its own private memory and they communicate by passing messages
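A minimal sketch in the distributed-memory style, assuming an MPI installation (compile with mpicc, run with mpirun -np 2). Because each process has its own private memory, rank 0 must explicitly send its value to rank 1 as a message.

```c
/* Distributed-memory sketch: no shared variables, only messages. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, value;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        value = 42;                                    /* lives only in rank 0's memory */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank 1 received %d\n", value);         /* arrived by message passing */
    }

    MPI_Finalize();
    return 0;
}
```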
MIMD:
Each processor has its own control unit and can execute different instructions on different data items
Shared memory parallel system:
Multiple processors can access the same memory simultaneously
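A minimal sketch of a shared-memory parallel program using POSIX threads (compile with -pthread; the counter and lock names are illustrative): both threads read and write the same variable in one address space, so access is guarded with a mutex.

```c
/* Shared-memory sketch: two threads update one shared counter. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;                       /* memory visible to all threads */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *increment(void *arg) {
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);             /* serialize concurrent access */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, increment, NULL);
    pthread_create(&t2, NULL, increment, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter);        /* 200000: both threads saw the same memory */
    return 0;
}
```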