Synchronization Primitives Flashcards
What is concurrency, and provide some real-world examples?
○ Concurrency refers to the execution of multiple tasks or processes seemingly at the same time. While it might appear that tasks are happening simultaneously, they are often interleaved or divided among available processing units.
○ Examples: Millions of drivers on a highway at once, a student doing homework while watching Netflix, faculty having lunch while grading papers and watching Netflix
How has Moore’s Law influenced the shift towards concurrent programming?
○ While transistor count has continued to rise, CPU clock speed has plateaued.
○ This has led to a focus on multi-core CPUs. To fully utilize these cores, we need to write applications that can execute tasks concurrently
Explain the difference between a process and a thread. What are the advantages and disadvantages of each for concurrent programming?
○ Process: An executing instance of a program with its own address space, resources, and a single thread of control. Processes provide strong isolation—if one process crashes, others are unaffected. However, inter-process communication is typically slow and resource-intensive.
○ Thread: A lightweight unit of execution that shares the address space and resources of its parent process. Threads within a process can communicate quickly through shared memory, offering tighter integration for concurrent tasks. However, a crashing thread can bring down the entire process
What are the advantages and disadvantages of threads sharing data? What is a data race?
○ Advantages: Efficient communication and data access between threads.
○ Disadvantages: Prone to data races, where multiple threads try to access and modify shared data simultaneously.
○ Data Race: Occurs when the outcome of a program depends on the unpredictable interleaving of thread executions, potentially causing unexpected and incorrect results
Explain why a single line of code in a program might not be atomic. Use an example to illustrate.
○ Even simple operations, like a = a + 1;, often translate into multiple machine instructions:
1. Load the value of a from memory into a register.
2. Increment the value in the register.
3. Store the updated value back into memory.
○ The scheduler can interrupt the execution of a thread between these instructions, allowing other threads to access and potentially modify shared data, leading to data races.
Study the thread schedule examples in slides 10-21. Explain why the final value of ‘balance’ can differ based on the scheduling of the threads.
○ The thread schedules (slides 10-21) demonstrate that the final value of the shared variable ‘balance’ can be 101 (incorrect) or 102 (correct) depending on how the operating system schedules the threads.
○ This highlights the non-deterministic nature of concurrent programs; the output can vary based on the order in which threads are allowed to execute their instructions.
What is meant by non-determinism in the context of concurrent programming? What challenges does it present?
○ Non-determinism means that the output of a concurrent program can differ even with the same input, depending on the unpredictable timing of thread execution.
○ This makes debugging and ensuring correctness difficult because you can’t always reproduce the same execution path.
Define a critical section and explain the importance of mutual exclusion.
○ Critical Section: A section of code where shared resources (like shared memory) are accessed. It’s crucial to ensure that only one thread is executing within a critical section at any given time to prevent data races.
○ Mutual Exclusion: The property that guarantees that if one thread is executing inside a critical section, no other thread can enter that critical section at the same time. It’s fundamental to preventing concurrency bugs.
How do locks help achieve mutual exclusion in concurrent programs?
○ Locks are synchronization mechanisms that enforce mutual exclusion by allowing only one thread to hold the lock at a time.
○ Before entering a critical section, a thread attempts to acquire the lock. If another thread holds the lock, the requesting thread blocks (waits) until the lock is released.
○ This ensures that only one thread can access shared resources within the critical section, preventing data races.
What is the POSIX Threads (pthreads) library? What are some of its key functions?
○ The pthreads library provides a standard C/C++ API for thread management and synchronization.
○ Key functions:
■ pthread_create(): Create a new thread.
■ pthread_exit(): Terminate the calling thread.
■ pthread_join(): Wait for a specific thread to finish executing.
■ pthread_mutex_lock(): Acquire a lock (mutex).
■ pthread_mutex_unlock(): Release a lock.
What is a deadlock? Describe a classic scenario that can lead to a deadlock.
○ Deadlock: A situation in concurrent programming where two or more threads are blocked indefinitely, each waiting for the other to release the resources that it needs.
○ Classic Example:
■ Thread A acquires lock 1 and then tries to acquire lock 2.
■ Thread B acquires lock 2 and then tries to acquire lock 1.
■ Both threads are now blocked forever, as each holds a lock the other needs.
Explain the purpose and use of condition variables in thread synchronization. When are they particularly useful?
○ Purpose: Condition variables are used to synchronize threads based on a particular condition becoming true.
○ Use Cases: Useful when a thread needs to wait for another thread to perform some action or change the state of shared data before it can proceed.
○ Example: A thread might use a condition variable to wait until a queue has data before trying to dequeue an item. This prevents the thread from busy-waiting.
Why must condition variables always be used in conjunction with a mutex?
○ Condition variables themselves do not provide mutual exclusion. Without a mutex, race conditions can arise when accessing the shared data or the condition variable itself.
○ The mutex ensures that checking the condition and starting to wait happen atomically; pthread_cond_wait() releases the mutex while the thread sleeps and reacquires it before returning. This prevents a lost wake-up, where another thread signals in the gap between the condition check and the start of the wait.
Explain the issue of spurious wake-ups with condition variables. How do you protect against them?
○ Spurious Wake-up: A thread waiting on a condition variable might sometimes wake up even if the condition hasn’t been signaled. This can happen due to various reasons, including signal delivery mechanisms or thread scheduling.
○ Protection: Always retest the condition in a loop after waking up from pthread_cond_wait(). This ensures that the thread only proceeds when the condition is genuinely met, preventing unexpected behavior
What is a semaphore? Describe its two primary operations.
○ Semaphore: A synchronization primitive that maintains a non-negative integer counter. It’s used to control access to a shared resource by multiple threads.
○ Operations:
■ sem_wait() (or wait): Attempts to decrement the counter. If the counter is 0, the thread blocks until it’s greater than 0.
■ sem_post() (or signal or post): Increments the counter. If any threads are blocked on the semaphore, one is unblocked, allowing it to decrement the counter and proceed.