Concurrency and Threads Flashcards

1
Q

What is the difference between concurrency and parallelism in operating systems?

A

Concurrency is the task of running and managing multiple computations over overlapping time periods. It doesn’t necessarily mean that they’ll run at the same instant: on a single CPU, rapid context switching gives the illusion of simultaneous execution.

Parallelism involves performing multiple computations simultaneously, often on different physical processors or cores. It requires actual simultaneous execution, ideal for multi-core or multi-CPU systems.

2
Q

How do threads enable concurrency within a single processor system?

A
  • Threads enable concurrency within a single processor system by allowing the CPU to switch rapidly between multiple threads within the same process.
  • This switching, managed by the scheduler, allocates small time slices to each thread, giving the illusion that all threads are running simultaneously.
  • Since threads share the same process context, including memory and resources, this switching is more efficient than process switching, enhancing system responsiveness and performance in concurrent tasks.
3
Q

What is a deadlock in a multi-threaded environment, and what are the four necessary conditions for it to occur?

A

Deadlock Definition: A deadlock is a situation in multi-threaded environments where two or more threads are unable to proceed with their execution because each is waiting for resources held by the others, creating an infinite loop of waiting.

Four Necessary Conditions (Coffman Conditions):
1. Mutual Exclusion: At least one resource must be non-shareable, causing other threads to wait until the resource is released.
2. Hold and Wait: Threads are holding resources while waiting to acquire more that are currently held by other threads.
3. No Preemption: Resources cannot be forcibly removed from the threads holding them; they must be released voluntarily.
4. Circular Wait: There is a circular chain of threads, with each thread waiting for a resource held by the next thread in the chain.
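Breaking any one Coffman condition prevents deadlock. A common technique is to break Circular Wait by imposing a global lock-acquisition order; the sketch below (an illustrative example, not from the card) shows two threads that both need two locks but always acquire them in the same order, so no cycle of waiting can form.

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()
completed = []

def worker(name):
    # Every thread acquires the locks in the same global order (a, then b).
    # This breaks the Circular Wait condition, so deadlock cannot occur
    # even though both threads need both locks.
    with lock_a:
        with lock_b:
            completed.append(name)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Both threads finish; if one thread had instead taken lock_b first,
# the two could deadlock waiting on each other.
```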

4
Q

What are semaphores in a multi-threaded environment, and how do they differ from mutexes?

A

Semaphores Definition: Semaphores are synchronization tools used in multi-threaded environments to control access to resources. They can be binary (like mutexes) or counting semaphores, which allow multiple units to be accessed concurrently.

Types of Semaphores:
Binary Semaphore: Similar to a mutex, used for mutual exclusion on a single shared resource. It has two states: locked (0) or unlocked (1).

Counting Semaphore: Allows access to a specified number of identical resources. It is initialized to the number of available resources, and its value at any moment is the number still free.

Functionality: Semaphores use two main operations: ‘wait’ (or ‘P’) to acquire a resource (decrement the semaphore) and ‘signal’ (or ‘V’) to release a resource (increment the semaphore).

Difference from Mutexes:
Mutexes: Intended strictly for mutual exclusion; a mutex has an owner and can only be unlocked by the thread that locked it.
Semaphores: More flexible, suitable for both mutual exclusion and other synchronization patterns. A semaphore has no owner, so any thread can signal it, not just the thread that performed the wait.
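A brief sketch of a counting semaphore, using Python's `threading.Semaphore` (the resource-pool scenario and counters are illustrative): the semaphore is initialized to 2, so at most two of the five threads can hold a "slot" at once; entering the `with` block performs wait/P and leaving it performs signal/V.

```python
import threading

pool_slots = threading.Semaphore(2)  # counting semaphore: 2 identical resources
in_use = 0
max_in_use = 0
guard = threading.Lock()             # protects the counters below

def use_resource():
    global in_use, max_in_use
    with pool_slots:                 # wait / P: decrement, block if count is 0
        with guard:
            in_use += 1
            max_in_use = max(max_in_use, in_use)
        with guard:
            in_use -= 1
    # leaving the with-block performs signal / V: increment the count

threads = [threading.Thread(target=use_resource) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# max_in_use never exceeds 2, the semaphore's initial count.
```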

5
Q

What is thread starvation in a multi-threaded environment?

A

Thread Starvation: This occurs when a thread is perpetually denied access to resources or processor time because other threads are monopolizing these resources.

Common Cause: Often arises in priority-based scheduling systems where lower priority threads are continually preempted by higher priority threads.

Implication: Leads to delayed or completely hindered execution of certain threads, reducing the program’s efficiency and potentially causing parts of the program to never run.

6
Q

What is thread livelock in a multi-threaded environment?

A

Thread Livelock: A situation where threads are actively changing their state or decisions in response to the state of other threads without making any real progress.

Cause: Typically occurs in algorithms designed to prevent deadlock, where threads try to avoid locking by continuously retrying an operation, but end up in a cycle where none of them can proceed.

Implication: Similar to deadlock in that no progress is made, but the system remains active, leading to resource wastage without achieving any useful work.

7
Q

What is the purpose of a barrier in multi-threaded programming, and how does it differ from mutexes and semaphores?

A

Barrier Purpose: A barrier is a synchronization mechanism used to align multiple threads at a specific point in the program. It ensures that all threads stop at the barrier and only proceed when all have reached this point.

Functionality: Barriers are used in scenarios where it is necessary for multiple threads to wait for each other before continuing their execution, ensuring that all threads complete one phase before moving to the next.

Difference from Mutexes and Semaphores:
Mutexes: Designed for mutual exclusion in accessing shared resources; only one thread can access the resource at a time.
Semaphores: Used for mutual exclusion (binary semaphores) and for controlling access to a pool of resources or signaling between threads (counting semaphores).
Barriers: Focus on synchronizing the execution timing of threads, not on resource access. They ensure that all threads wait at a specific execution point before any of them can continue.
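The phase-alignment behavior above can be sketched with Python's `threading.Barrier` (the two-phase workload and log are illustrative): three workers each finish "phase 1", block at the barrier, and only then may any of them begin "phase 2".

```python
import threading

barrier = threading.Barrier(3)   # all 3 threads must arrive before any proceeds
log = []
log_lock = threading.Lock()

def worker(i):
    with log_lock:
        log.append(("phase1", i))
    barrier.wait()               # block here until all 3 threads have arrived
    with log_lock:
        log.append(("phase2", i))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Every phase1 entry is logged before any phase2 entry.
```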

8
Q

What is a ‘Thread Pool’ in multi-threaded programming, and what are its advantages?

A

Thread Pool Definition: A thread pool is a collection of pre-created threads that are available to perform tasks. Instead of creating new threads for each task, tasks are delegated to an available thread from this pool.

Advantages:
Efficient Resource Management: Reduces the overhead of creating and destroying threads for each task, conserving system resources.
Improved Performance: Pre-existing threads eliminate the time delay in thread creation, leading to faster task commencement.
Control Over Concurrent Execution: Limits the number of active threads, preventing excessive resource consumption and potential system instability.
Task Queue Management: Enables effective handling of incoming tasks through a managed queue, facilitating task prioritization and orderly processing.
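A minimal thread-pool sketch using Python's `concurrent.futures.ThreadPoolExecutor` (the `square` task is illustrative): four pre-created worker threads are reused across ten tasks, which are queued and dispatched by the pool rather than spawning a thread per task.

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # Stand-in for a real task submitted to the pool.
    return n * n

# 4 worker threads service 10 tasks; the pool queues tasks and reuses threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(10)))

print(results)  # → [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

`pool.map` preserves input order in its results even though tasks may complete out of order on different workers.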

9
Q

What is Thread Local Storage (TLS) in multi-threaded programming, and how does it differ from global and local variables?

A

Thread Local Storage (TLS) Definition:
A programming construct providing each thread with its own unique static storage area.
TLS variables are unique to each thread, allowing threads to hold different values for the same named variable.
Difference from Global and Local Variables:

Global Variables: Shared across all threads, located in a fixed memory location (data segment), with changes visible to all threads.
Local Variables: Specific to a function and stored on the stack, unique to each function call and inherently thread-safe due to separate stacks for each thread.
TLS Variables: Offer a ‘global-like’ scope on a per-thread basis, avoiding concurrency issues of global variables while providing thread-specific persistent data, unlike local variables.
Use Case:
Ideal for data that needs to be thread-specific yet persistent throughout the thread’s life, like thread-specific error codes or temporary buffers.
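A short TLS sketch using Python's `threading.local` (the `seen` dictionary is illustrative bookkeeping): `tls.value` looks like a single shared attribute, but each thread reads back only the value it stored itself.

```python
import threading

tls = threading.local()   # each thread gets its own set of attributes on this object
seen = {}
seen_lock = threading.Lock()

def worker(value):
    tls.value = value     # "global-like" name, but storage is per-thread
    # Later in the same thread, tls.value is still this thread's own value,
    # regardless of what other threads have assigned to it.
    with seen_lock:
        seen[value] = tls.value

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Each thread saw exactly the value it wrote; no cross-thread interference.
```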

10
Q

What is the Readers-Writers problem in multi-threaded programming?

A

Definition: A classic synchronization challenge involving a shared resource that can be concurrently read by multiple threads (readers) and written by others (writers).
Key Challenge: Balancing the need for simultaneous access by multiple readers while ensuring exclusive access for writers to maintain data consistency.
Constraints: No reader should read the resource while a writer is writing, and writers need exclusive access to prevent concurrent read/write operations.

11
Q

What are typical solutions to address the Readers-Writers problem?

A

Readers Priority: Prioritizes reader access. Writers wait until no readers are active. Effective for read-intensive operations but can lead to writer starvation.
Writers Priority: Gives precedence to writers. Readers can access only if no writers are waiting. Prevents writer delays but can cause reader starvation.
Fair (FIFO) Approach: Queues both readers and writers to balance access. Aims for fairness but can be less efficient due to increased context switching.
Synchronization Tools: Utilizes semaphores, mutexes, and condition variables to manage and synchronize access to the shared resource.
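The readers-priority approach can be sketched with a mutex protecting a reader count plus a resource lock (class and variable names are illustrative): the first reader to arrive locks out writers, the last reader to leave admits them, and writers take the resource lock exclusively. Note this sketch inherits the writer-starvation risk described above.

```python
import threading

class ReadersPriorityLock:
    """Readers-priority readers-writers lock: readers share access;
    writers get exclusive access but may starve under heavy reading."""
    def __init__(self):
        self._readers = 0
        self._reader_mutex = threading.Lock()  # protects the reader count
        self._resource = threading.Lock()      # exclusive access to the resource

    def acquire_read(self):
        with self._reader_mutex:
            self._readers += 1
            if self._readers == 1:
                self._resource.acquire()       # first reader blocks writers

    def release_read(self):
        with self._reader_mutex:
            self._readers -= 1
            if self._readers == 0:
                self._resource.release()       # last reader admits writers

    def acquire_write(self):
        self._resource.acquire()

    def release_write(self):
        self._resource.release()

# Illustrative usage: 3 writers append, 2 readers observe a consistent length.
rw = ReadersPriorityLock()
data = []
observed = []

def writer(n):
    rw.acquire_write()
    data.append(n)                 # exclusive: no reader sees a partial write
    rw.release_write()

def reader():
    rw.acquire_read()
    observed.append(len(data))     # shared: other readers may read concurrently
    rw.release_read()

threads = [threading.Thread(target=writer, args=(i,)) for i in range(3)]
threads += [threading.Thread(target=reader) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```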
