Multithreading 2 Flashcards
[1] What is Concurrency?
[2] What is Multithreading?
[3] What is the difference between Concurrency and Multithreading?
[1] Concurrency
Concurrency is when two or more tasks can start, run, and complete in overlapping time periods. It doesn’t necessarily mean they’ll ever both be running at the same instant. For example, multitasking on a single-core machine.
[2] Multithreading
Multithreading is the use of multiple threads of execution within a single process; the threads share the process’s memory and can run concurrently. It should not be confused with parallelism, which is when tasks literally run at the same instant, e.g., on a multicore processor.
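As a quick illustration, here is a minimal Java sketch (the class and field names are invented for this card) in which two threads run within one process and write to memory the process shares:

```java
// Minimal multithreading sketch: two threads of the same process
// run concurrently and write to fields that the process shares.
public class TwoThreadsDemo {
    static int resultA;
    static int resultB;

    public static int runBoth() throws InterruptedException {
        Thread a = new Thread(() -> resultA = 21);
        Thread b = new Thread(() -> resultB = 21);
        a.start();   // both threads now run concurrently
        b.start();
        a.join();    // wait until each thread finishes
        b.join();
        return resultA + resultB;
    }
}
```

Whether the two threads truly execute at the same instant depends on the number of available cores: concurrency is guaranteed here, parallelism is not.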
[3]
Concurrency is about structure: tasks make progress in overlapping time periods. Multithreading is one way to achieve concurrency: a single process runs several threads, which may or may not execute at the same instant depending on the number of cores.
What is a “Critical section”?
A critical section is a piece of code that can be executed by only one task at a time because it accesses a shared resource. Mutual exclusion is the mechanism used to guarantee this requirement, and it can be implemented in different ways.
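To make the idea concrete, here is a small Java sketch (all names are illustrative) in which a synchronized block marks the critical section guarding a shared counter:

```java
// Sketch: the synchronized block is the critical section; only one
// thread at a time can execute it per lock object.
public class CriticalSectionDemo {
    private final Object lock = new Object(); // guards the section
    private int counter;                      // shared resource

    public int incrementManyTimes(int n) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < n; i++) {
                synchronized (lock) { // critical section begins
                    counter++;        // shared-resource access
                }                     // critical section ends
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        return counter;
    }
}
```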
What are the different mechanisms to get synchronization in a concurrent system?
[1] Semaphore
A semaphore is a mechanism that can be used to control access to one or more units of a resource. It has a variable that stores the number of units available and two atomic operations to manage the value of that variable. A mutex (short for mutual exclusion) is a special kind of semaphore that can take only two values (resource free and resource busy), and only the task that sets the mutex to busy can release it. A mutex can help you avoid race conditions by protecting a critical section.
[2] Monitor
A monitor is a mechanism to get mutual exclusion over a shared resource. It has a mutex, a condition variable, and two operations: one to wait for the condition and one to signal it. Once you signal the condition, only one of the tasks waiting for it continues its execution.
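In Java, every object is a monitor: synchronized provides the mutex, and wait()/notify() act as the condition variable. A minimal sketch (the one-slot buffer class is invented for this card):

```java
// One-slot buffer built on Java's intrinsic monitor:
// synchronized = mutex, wait()/notifyAll() = condition variable.
public class OneSlotBuffer {
    private Integer slot; // shared resource guarded by the monitor

    public synchronized void put(int value) throws InterruptedException {
        while (slot != null) {
            wait();      // wait for the "slot is free" condition
        }
        slot = value;
        notifyAll();     // signal tasks waiting for a value
    }

    public synchronized int take() throws InterruptedException {
        while (slot == null) {
            wait();      // wait for the "slot is full" condition
        }
        int value = slot;
        slot = null;
        notifyAll();     // signal tasks waiting for free space
        return value;
    }
}
```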
What is a race condition?
You can have a data race (also called a race condition) in your application when two or more tasks write a shared variable outside a critical section, that is to say, without using any synchronization mechanism. Under these circumstances, the final result of your application may depend on the order of execution of the tasks.
package com.packt.java.concurrency;

public class Account {

  private float balance;

  // Not atomic: the read and the write are two separate steps,
  // so another task can run between them.
  public void modify(float difference) {
    float value = this.balance;
    this.balance = value + difference;
  }
}
Imagine that two different tasks execute the modify() method on the same Account object. Depending on the order of execution of the statements in the tasks, the final result can vary. Suppose that the initial balance is 1000 and the two tasks call the modify() method with 1000 as a parameter. The final result should be 3000, but if both tasks execute the first statement at the same time and then the second statement at the same time, the final result will be 2000. As you can see, the modify() method is not atomic and the Account class is not thread-safe.
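One way to fix the class above is to turn the read-modify-write sequence into a critical section, for example with synchronized (this is one possible fix, sketched here; the java.util.concurrent.atomic classes would be an alternative approach):

```java
// Sketch of a thread-safe variant: synchronized makes modify()
// atomic, so the read and the write can no longer interleave.
public class SafeAccount {
    private float balance;

    public synchronized void modify(float difference) {
        balance += difference; // one task at a time executes this
    }

    public synchronized float getBalance() {
        return balance;
    }
}
```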
What is deadlock?
There is a deadlock in your concurrent application when two or more tasks are each waiting for a shared resource that is held by another task, which in turn is waiting for a resource held by one of the first ones. It happens when four conditions (the Coffman conditions: mutual exclusion, hold and wait, no preemption, and circular wait) hold simultaneously in the system.
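A classic way to break the circular-wait condition is to acquire locks in one global order. A sketch with two ReentrantLocks (the lock and method names are invented for this card):

```java
import java.util.concurrent.locks.ReentrantLock;

// Sketch: deadlock needs a circular wait. If every task takes
// lockA before lockB, no cycle of waiting tasks can form.
public class LockOrdering {
    private static final ReentrantLock lockA = new ReentrantLock();
    private static final ReentrantLock lockB = new ReentrantLock();

    public static int runTwoTasks() throws InterruptedException {
        int[] shared = {0};
        Runnable task = () -> {
            lockA.lock();          // always first in the global order
            try {
                lockB.lock();      // always second
                try {
                    shared[0]++;   // work on both protected resources
                } finally {
                    lockB.unlock();
                }
            } finally {
                lockA.unlock();
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        return shared[0];
    }
}
```

If one task instead took lockB first, both tasks could each hold one lock while waiting forever for the other, which is exactly the circular wait described above.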
What is a “Livelock”?
A livelock occurs when you have two tasks in your system that are always
changing their states due to the actions of the other. Consequently, they are in a
loop of state changes and unable to continue.
For example, you have two tasks - Task 1 and Task 2, and both need two
resources - Resource 1 and Resource 2. Suppose that Task 1 has a lock on
Resource 1, and Task 2 has a lock on Resource 2. As they are unable to gain
access to the resource they need, they free their resources and begin the cycle
again. This situation can continue indefinitely, so the tasks will never end their
execution.
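Real livelocks depend on timing, so here is a deterministic, single-threaded simulation of the cycle described above rather than actual threading code (all names are invented):

```java
// Deterministic simulation of the livelock above: each task holds one
// resource, observes the other resource is busy, backs off, and
// immediately re-acquires its own resource, forever.
public class LivelockSim {
    public static boolean makesProgress(int maxSteps) {
        boolean task1HoldsR1 = true; // Task 1 holds Resource 1
        boolean task2HoldsR2 = true; // Task 2 holds Resource 2
        for (int step = 0; step < maxSteps; step++) {
            boolean r2Busy = task2HoldsR2; // both observe in lockstep
            boolean r1Busy = task1HoldsR1;
            if (r2Busy) {
                task1HoldsR1 = false;      // Task 1 backs off
            } else {
                return true;               // Task 1 got both resources
            }
            if (r1Busy) {
                task2HoldsR2 = false;      // Task 2 backs off
            } else {
                return true;               // Task 2 got both resources
            }
            task1HoldsR1 = true;           // both retry immediately
            task2HoldsR2 = true;
        }
        return false; // states kept changing, but no task progressed
    }
}
```

Unlike a deadlock, both tasks stay active the whole time; they just never get past the same state-change loop.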