Concurrency and Parallelism Flashcards
What is concurrency in computing?
Concurrency in computing refers to a system's ability to manage multiple tasks during overlapping time periods, making progress on more than one task without necessarily executing them at the same instant.
Define parallelism in the context of computing.
Parallelism is the simultaneous execution of multiple tasks or processes to achieve faster computation, typically by utilizing multiple processors or cores.
Explain the difference between concurrency and parallelism.
Concurrency is about structuring a program to deal with multiple tasks at the same time (for example, by interleaving them on a single core), while parallelism is about actually executing multiple tasks simultaneously on separate processors or cores.
What is a thread?
A thread is the smallest unit of execution within a process. Multiple threads within a process can run concurrently, sharing the same resources such as memory and open files.
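As a minimal sketch using Python's standard `threading` module, the example below starts several threads in one process; the shared `results` list illustrates that all threads see the same process memory (the worker name strings are placeholders):

```python
import threading

results = []  # shared process memory, visible to every thread

def worker(name):
    # Each thread runs this function on its own call stack,
    # but all threads share the process's memory (the list above).
    results.append(f"hello from {name}")

threads = [threading.Thread(target=worker, args=(f"thread-{i}",)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # block until every thread has finished
```

After `join()` returns for all three threads, `results` holds one entry per thread.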
Define a race condition in the context of concurrent programming.
A race condition occurs when a program's behavior depends on the relative timing or interleaving of threads accessing shared state, so the outcome is unpredictable and can vary from run to run.
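A sketch of the classic lost-update race in Python: the read-modify-write on `counter` is not atomic, so two threads can read the same value and one increment gets lost. How many updates are lost (if any) depends on thread scheduling, which is exactly what makes it a race:

```python
import threading

counter = 0  # shared mutable state, with no synchronization

def increment(n):
    global counter
    for _ in range(n):
        # Read-modify-write is NOT atomic: another thread can run
        # between the read and the write, overwriting its update.
        tmp = counter
        counter = tmp + 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# We hoped for 400000, but the actual value is timing-dependent:
# it may be lower whenever two threads raced on the same read.
print(counter)
```

Note that in CPython the global interpreter lock makes the window small, so some runs may happen to produce the full count; the bug is still there, just intermittent.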
Explain the concept of a critical section.
A critical section is a part of a program that accesses shared resources; only one thread is allowed to execute it at a time, which avoids conflicts and maintains data consistency.
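One way to sketch this in Python is to wrap the shared-counter update in a `threading.Lock`, turning the read-modify-write into a critical section that only one thread can be inside at a time:

```python
import threading

counter = 0
lock = threading.Lock()  # guards the critical section below

def increment(n):
    global counter
    for _ in range(n):
        with lock:
            # ---- critical section: one thread at a time ----
            counter += 1
            # ---- end critical section ----

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # always 400000: no updates are lost
```

Keeping the critical section as short as possible matters: while one thread holds the lock, every other thread that needs it is blocked.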
What is a deadlock in concurrent programming?
A deadlock is a situation where two or more threads are unable to proceed because each is waiting for the other to release a resource, leading to a standstill.
Define parallel computing.
Parallel computing involves breaking down a problem into smaller sub-problems and solving them simultaneously, often by distributing the tasks across multiple processors or computing nodes.
Explain the concept of a mutex (mutual exclusion).
A mutex is a synchronization mechanism used to control access to a shared resource by multiple threads, ensuring that only one thread can access the resource at a time.
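In Python, `threading.Lock` plays the role of a mutex. The sketch below uses explicit `acquire()`/`release()` with `try`/`finally` to guard a shared balance (the bank-withdrawal scenario is illustrative); in practice the `with lock:` form is preferred because it releases automatically:

```python
import threading

balance = 100
mutex = threading.Lock()  # Python's Lock acts as a mutex

def withdraw(amount):
    global balance
    # acquire() blocks until no other thread holds the mutex,
    # guaranteeing exclusive access to `balance`.
    mutex.acquire()
    try:
        if balance >= amount:
            balance -= amount
    finally:
        mutex.release()  # always release, even on error

threads = [threading.Thread(target=withdraw, args=(30,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)  # 10: three withdrawals succeed, the fourth is refused
```

Without the mutex, two threads could both pass the `balance >= amount` check before either subtracts, overdrawing the account.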
What is the difference between task parallelism and data parallelism?
Task parallelism involves executing different tasks concurrently, while data parallelism involves distributing data across multiple processors and performing the same operation on each data element simultaneously.
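A sketch of both styles using `concurrent.futures` with threads (the `fetch_config`/`fetch_data` helpers are hypothetical stand-ins for real I/O work). Note that CPython threads give concurrency but not CPU parallelism because of the global interpreter lock; for CPU-bound work you would swap in `ProcessPoolExecutor` with module-level functions:

```python
from concurrent.futures import ThreadPoolExecutor

# Task parallelism: DIFFERENT tasks run concurrently.
def fetch_config():
    return {"retries": 3}

def fetch_data():
    return [1, 2, 3]

with ThreadPoolExecutor() as pool:
    f1 = pool.submit(fetch_config)  # task A
    f2 = pool.submit(fetch_data)    # task B
    config, data = f1.result(), f2.result()

# Data parallelism: the SAME operation applied to every element,
# with the elements distributed across the workers.
with ThreadPoolExecutor() as pool:
    doubled = list(pool.map(lambda x: x * 2, data))

print(config, doubled)  # {'retries': 3} [2, 4, 6]
```

The distinction is in what gets divided: task parallelism divides the work by function, data parallelism divides it by input.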