Definitions Flashcards
synchronization
the coordination of two or more tasks to get the desired results
Control synchronization
When one task depends on the end of another task: for example, the second task can't start before the first has finished
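A minimal Java sketch of this idea (the class and variable names are illustrative, not from the text): the second task blocks on a CountDownLatch until the first task signals that it has finished.

```java
import java.util.concurrent.CountDownLatch;

public class ControlSyncExample {
    public static void main(String[] args) throws InterruptedException {
        CountDownLatch firstTaskDone = new CountDownLatch(1);

        Thread first = new Thread(() -> {
            System.out.println("First task: doing its work");
            firstTaskDone.countDown(); // signal that the first task has finished
        });

        Thread second = new Thread(() -> {
            try {
                firstTaskDone.await(); // block until the first task has finished
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
            System.out.println("Second task: starts only after the first one");
        });

        second.start();
        first.start();
        first.join();
        second.join();
    }
}
```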
Data access synchronization
When two or more tasks have access to a shared variable and only one of the tasks can access the variable at any given time
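A minimal Java sketch (SharedCounter is an illustrative name): the synchronized keyword ensures that only one task at a time executes the methods that touch the shared variable.

```java
public class SharedCounter {
    private int value = 0; // the shared variable

    // Only one task at a time can run this method on the same instance,
    // so no increment is lost.
    public synchronized void increment() {
        value++;
    }

    public synchronized int get() {
        return value;
    }
}
```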
granularity
the number of tasks that can be performed independently, without intercommunication, in your parallel algorithm
coarse-grained granularity
(big tasks with low intercommunication), the overhead due to synchronization will be low. However, you may not take advantage of all the cores of your system
fine-grained granularity
(small tasks with high intercommunication), the overhead due to synchronization will be high, and the throughput of your algorithm may suffer.
Semaphore
A semaphore is a mechanism that can be used to control the access to one or more units of a resource. It has a variable that stores the number of resources that can be used and two atomic operations to manage the value of the variable
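A sketch using java.util.concurrent.Semaphore, assuming a hypothetical pool of three printers (the class and method names are illustrative): acquire() and release() are the two atomic operations that decrement and increment the counter of free units.

```java
import java.util.concurrent.Semaphore;

public class PrinterQueue {
    // Three identical printers: the semaphore counts the free units of the resource.
    private final Semaphore printers = new Semaphore(3);

    public void printJob(String document) throws InterruptedException {
        printers.acquire();       // take one unit, blocking if none is free
        try {
            System.out.println("Printing " + document);
        } finally {
            printers.release();   // give the unit back
        }
    }
}
```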
mutex
A mutex (short for mutual exclusion) is a special kind of semaphore that can take only two values (resource is free and resource is busy), and only the process that sets the mutex to busy can release it.
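In Java, a ReentrantLock matches this definition closely: it is either free or busy, and only the thread that acquired it may release it (calling unlock() from another thread throws IllegalMonitorStateException). The class and field names in the sketch are illustrative.

```java
import java.util.concurrent.locks.ReentrantLock;

public class MutexExample {
    private final ReentrantLock mutex = new ReentrantLock();
    private int balance = 0;

    public void deposit(int amount) {
        mutex.lock();          // mark the resource as busy
        try {
            balance += amount; // only the owner of the mutex runs this code
        } finally {
            mutex.unlock();    // only the thread that locked it can release it
        }
    }
}
```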
Monitor
A monitor is a mechanism to get mutual exclusion over a shared resource. It has a mutex, a condition variable, and two operations to wait for the condition and to signal the condition. Once you signal the condition, only one of the tasks that are waiting for it continues with its execution.
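A sketch of a monitor built in Java from a ReentrantLock plus a Condition (BoundedBuffer is an illustrative name): await() waits for the condition while releasing the mutex, and signal() wakes exactly one of the waiting tasks.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

public class BoundedBuffer {
    private final Lock mutex = new ReentrantLock();
    private final Condition notEmpty = mutex.newCondition();
    private final Deque<Integer> items = new ArrayDeque<>();

    public void put(int item) {
        mutex.lock();
        try {
            items.addLast(item);
            notEmpty.signal();    // wakes exactly one task waiting on the condition
        } finally {
            mutex.unlock();
        }
    }

    public int take() throws InterruptedException {
        mutex.lock();
        try {
            while (items.isEmpty()) {
                notEmpty.await(); // wait for the condition, releasing the mutex
            }
            return items.removeFirst();
        } finally {
            mutex.unlock();
        }
    }
}
```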
Mutual exclusion
A concept closely related to synchronization is the critical section. A critical section is a piece of code that can be executed by only one task at any given time because of its access to a shared resource. Mutual exclusion is the mechanism used to guarantee this requirement, and it can be implemented in different ways.
thread safety
A piece of code (or a method or an object) is thread-safe if all the users of shared data are protected by synchronization mechanisms or by a nonblocking compare-and-swap (CAS) primitive, or if the data is immutable, so you can use that code in a concurrent application without any problem.
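Immutability is one of the routes mentioned above; a minimal Java sketch (the Point class is illustrative):

```java
// Immutable value class: all fields are final and never exposed for modification,
// so instances can be shared between tasks without any synchronization.
public final class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    public int getX() { return x; }
    public int getY() { return y; }

    // Instead of mutating the state, return a new instance.
    public Point translate(int dx, int dy) {
        return new Point(x + dx, y + dy);
    }
}
```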
atomic operation
a kind of operation that appears to occur instantaneously to the rest of the tasks of the program
atomic variable
a kind of variable with atomic operations to set and get its value. You can implement an atomic variable using a synchronization mechanism or in a lock-free manner using CAS, which doesn’t need any synchronization.
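A sketch with java.util.concurrent.atomic.AtomicInteger, which implements this idea in a lock-free way using CAS (the AtomicCounter class is illustrative):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class AtomicCounter {
    // Lock-free counter: updates use CAS internally instead of a lock.
    private final AtomicInteger value = new AtomicInteger(0);

    public int increment() {
        return value.incrementAndGet(); // atomic read-modify-write
    }

    // Explicit CAS: store newValue only if the current value is still expected.
    public boolean trySet(int expected, int newValue) {
        return value.compareAndSet(expected, newValue);
    }

    public int get() {
        return value.get();
    }
}
```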
shared memory
normally it is used when the tasks are running on the same computer. The tasks use the same memory area, where they write and read values. To avoid problems, access to this shared memory has to happen in a critical section protected by a synchronization mechanism.
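A minimal sketch of two tasks in the same JVM communicating through a shared list, with the access placed in critical sections as the definition requires (the names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class SharedMemoryExample {
    // The shared memory area: both tasks write to and read from this list.
    private static final List<String> sharedArea = new ArrayList<>();

    public static void main(String[] args) throws InterruptedException {
        Thread writer = new Thread(() -> {
            synchronized (sharedArea) {      // critical section protecting the write
                sharedArea.add("value produced by the writer");
            }
        });

        writer.start();
        writer.join();                       // wait for the writer to finish

        synchronized (sharedArea) {          // critical section protecting the read
            System.out.println("Reader sees: " + sharedArea);
        }
    }
}
```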
message passing
normally it is used when the tasks are running on different computers. When a task needs to communicate with another, it sends a message that follows a predefined protocol. This communication can be synchronous, if the sender is blocked waiting for a response, or asynchronous, if the sender continues with its execution after sending the message.
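Between computers this usually means sockets or a messaging library; the sketch below only illustrates the asynchronous sender/receiver idea inside one JVM, using a BlockingQueue as the channel (the names are illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MessagePassingExample {
    public static void main(String[] args) throws InterruptedException {
        // The channel that carries messages between the two tasks.
        BlockingQueue<String> channel = new ArrayBlockingQueue<>(10);

        Thread receiver = new Thread(() -> {
            try {
                String message = channel.take();  // blocks until a message arrives
                System.out.println("Received: " + message);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        receiver.start();

        channel.put("hello");  // asynchronous send: the sender does not wait for a reply
        System.out.println("Sender continues immediately after sending");

        receiver.join();
    }
}
```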