Synchronisation Flashcards
Why is a while-loop used around pthread_cond_wait() in a multi-threaded consumer scenario?
Mutex Release and Block: Consumer threads block on a condition variable (signalc). pthread_cond_wait() atomically releases the mutex while the thread is blocked, allowing other threads to access the shared buffer.
Wake Up and Mutex Reacquisition: When the condition is signalled (or broadcast, in which case all waiting consumers wake up), each woken thread must reacquire the mutex before pthread_cond_wait() returns, so only one thread at a time can process the buffer.
Rechecking Buffer State: Between being woken and reacquiring the mutex, another consumer may have run first and emptied the buffer again, so the condition (buffer non-empty) must be rechecked. A while-loop ensures this re-evaluation (and also guards against spurious wakeups), maintaining proper synchronisation among consumer threads sharing a buffer.
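A minimal C sketch of this pattern, reusing the card's signalc name for the condition variable; the buffer layout and sizes are illustrative assumptions, not part of the original card:

```c
#include <pthread.h>

#define BUF_SIZE 16
static int buffer[BUF_SIZE];
static int count = 0;                 /* number of items currently in the buffer */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  signalc = PTHREAD_COND_INITIALIZER;  /* "buffer not empty" */

int consume_item(void)
{
    pthread_mutex_lock(&lock);
    /* while, not if: by the time this thread reacquires the mutex, another
     * consumer may already have emptied the buffer, and POSIX also permits
     * spurious wakeups. */
    while (count == 0)
        pthread_cond_wait(&signalc, &lock);  /* atomically releases the mutex while blocked */
    int item = buffer[--count];
    pthread_mutex_unlock(&lock);
    return item;
}
```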
What is “Priority Inversion” and how does it affect multi-threaded systems?
Definition: Priority Inversion occurs when a lower-priority thread holds a resource needed by a higher-priority thread, leading to an inversion of the intended priority scheme.
Typical Scenario: Involves three threads – a high, a medium, and a low-priority thread. The low-priority thread acquires a lock on a shared resource.
Blocking of High-Priority Thread: The high-priority thread is blocked because it needs the resource locked by the low-priority thread.
Interruption by Medium-Priority Thread: A medium-priority thread, not needing the resource, preempts the low-priority thread, preventing it from releasing the resource.
System Impact: The higher-priority thread is effectively blocked by a lower-priority one, leading to performance issues, especially in real-time systems.
Mitigation: Techniques like priority inheritance, where the low-priority thread inherits the higher priority of the thread it blocks, are used to address priority inversion.
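One common way to get priority inheritance in POSIX is to create the contended mutex with the priority-inheritance protocol. A minimal sketch, assuming the platform supports the optional PTHREAD_PRIO_INHERIT protocol; the function name is illustrative:

```c
#include <pthread.h>

/* Initialise a mutex with the priority-inheritance protocol, so a
 * low-priority thread holding the lock temporarily runs at the priority
 * of the highest-priority thread blocked on it. */
int init_pi_mutex(pthread_mutex_t *m)
{
    pthread_mutexattr_t attr;
    int rc = pthread_mutexattr_init(&attr);
    if (rc != 0)
        return rc;
    rc = pthread_mutexattr_setprotocol(&attr, PTHREAD_PRIO_INHERIT);
    if (rc == 0)
        rc = pthread_mutex_init(m, &attr);
    pthread_mutexattr_destroy(&attr);
    return rc;
}
```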
What is a race condition?
A race condition occurs when two or more threads access shared data without coordination and at least one access is a write, so the outcome depends on the unpredictable interleaving of their operations.
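A minimal C example of a race: two threads increment a shared counter with no lock, so read-modify-write updates are lost and the final value varies from run to run (thread names and iteration counts are illustrative):

```c
#include <pthread.h>
#include <stdio.h>

static long counter = 0;   /* shared data, intentionally unprotected */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;         /* read-modify-write: not atomic, so updates can be lost */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    /* Expected 2000000, but the printed value differs between runs. */
    printf("counter = %ld\n", counter);
    return 0;
}
```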
What is “Deadlock Prevention” in a multi-threaded environment?
Definition: Deadlock Prevention involves designing the system to eliminate the possibility of a deadlock.
Approach: Ensures that at least one of the necessary conditions for a deadlock (mutual exclusion, hold and wait, no preemption, circular wait) cannot occur.
Examples:
Requiring a thread to request all resources at once.
Using non-blocking locks like try-lock instead of blocking locks.
Goal: Structure the system so that deadlocks are impossible by construction.
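A minimal sketch of the try-lock approach in C: the hypothetical lock_both() never waits for one lock while holding another, which breaks the hold-and-wait condition (at the cost of possible retries):

```c
#include <pthread.h>
#include <sched.h>

/* Acquire two locks without "hold and wait": if the second lock is
 * unavailable, release the first and retry, so no thread ever blocks
 * while holding a resource. */
void lock_both(pthread_mutex_t *a, pthread_mutex_t *b)
{
    for (;;) {
        pthread_mutex_lock(a);
        if (pthread_mutex_trylock(b) == 0)
            return;               /* both held; no circular wait can form */
        pthread_mutex_unlock(a);  /* back off and try again */
        sched_yield();            /* avoid busy-spinning against the other thread */
    }
}
```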
What is “Deadlock Avoidance” in a multi-threaded environment?
Definition: Deadlock Avoidance involves making dynamic resource allocation decisions to ensure the system never enters an unsafe state where a deadlock is possible.
Method: The system tracks current resource allocations and future requests, analyzing if the next allocation could potentially lead to a deadlock.
Strategy: Only allow resource allocations that lead to safe states. Safe states are those where there is a sequence of allocations that can satisfy all process needs without a deadlock.
Example: Banker’s Algorithm, which assesses the safety of a state before allocating resources.
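A minimal sketch in C of the safety check at the heart of the Banker's Algorithm, with illustrative process and resource counts; a full implementation would also validate each pending request against the process's declared maximum before running this check:

```c
#include <stdbool.h>
#include <string.h>

#define NPROC 5   /* number of processes (illustrative) */
#define NRES  3   /* number of resource types (illustrative) */

/* Returns true if, given the current allocation and remaining needs,
 * some ordering of the processes lets every one of them run to completion. */
bool state_is_safe(const int available[NRES],
                   const int allocation[NPROC][NRES],
                   const int need[NPROC][NRES])
{
    int work[NRES];
    bool finished[NPROC] = { false };
    memcpy(work, available, sizeof work);

    for (int done = 0; done < NPROC; ) {
        bool progressed = false;
        for (int p = 0; p < NPROC; p++) {
            if (finished[p])
                continue;
            bool can_run = true;
            for (int r = 0; r < NRES; r++)
                if (need[p][r] > work[r]) { can_run = false; break; }
            if (can_run) {
                /* Pretend p runs to completion and returns its resources. */
                for (int r = 0; r < NRES; r++)
                    work[r] += allocation[p][r];
                finished[p] = true;
                progressed = true;
                done++;
            }
        }
        if (!progressed)
            return false;   /* no process can finish: the state is unsafe */
    }
    return true;            /* a safe sequence exists */
}
```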