Concurrency Flashcards

1
Q

Thread

A

A thread is the execution of a code path of Java statements that are performed sequentially.

2
Q

Runnable

A

Thread supports execution of tasks that are implementations of the java.lang.Runnable interface.
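
A minimal sketch of running a Runnable task on a separate thread (the class name, thread name, and printed message are illustrative):

public class RunnableDemo {
    public static void main(String[] args) throws InterruptedException {
        // A Runnable is the task; the Thread is the execution vehicle for it.
        Runnable task = () -> System.out.println(
                "Running in " + Thread.currentThread().getName());

        Thread worker = new Thread(task, "worker-1");
        worker.start();   // schedules the task for execution on the new thread
        worker.join();    // wait for the worker to finish before main exits
    }
}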

3
Q

Thread on operating system level

A

On the operating-system level, a thread has both an instruction pointer and a stack pointer. The instruction pointer references the next instruction to be processed, and the stack pointer references a private memory area, not available to other threads, where thread-local data is stored. Thread-local data is typically the local variables defined in the Java methods of the application.

4
Q

Scheduler

A

For the user to perceive that applications can run in parallel, the CPU has to share its processing time between the application threads. The sharing of a CPU’s processing time is handled by a scheduler.

5
Q

Thread priority

A

The scheduling strategy can be implemented in various ways, but it is mainly based on the thread priority: a high-priority thread gets the CPU allocation before a low-priority thread, which gives more execution time to high-priority threads. Thread priority in Java can be set between 1 (lowest) and 10 (highest), but—unless explicitly set—the normal priority is 5.
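
A small sketch using the priority constants defined on java.lang.Thread (the class and thread names are illustrative; priority is only a hint to the scheduler):

public class PriorityDemo {
    public static void main(String[] args) {
        Runnable task = () -> System.out.println(
                Thread.currentThread().getName()
                + " runs with priority "
                + Thread.currentThread().getPriority());

        Thread low  = new Thread(task, "low");
        Thread high = new Thread(task, "high");

        // Thread.MIN_PRIORITY == 1, Thread.NORM_PRIORITY == 5, Thread.MAX_PRIORITY == 10.
        low.setPriority(Thread.MIN_PRIORITY);
        high.setPriority(Thread.MAX_PRIORITY);

        // Priority influences scheduling but does not guarantee execution order.
        high.start();
        low.start();
    }
}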

6
Q

Thread starvation

A

Thread starvation is when low-priority threads are not given enough processing time to carry out the job they were intended for. Hence, schedulers also take the processing time already given to each thread into account when switching to a new thread.

7
Q

Context switch

A

A thread change is known as a context switch. A context switch starts by storing the state of the executing thread so that its execution can be resumed at a later point; that thread then has to wait. The scheduler then restores another waiting thread for processing.

8
Q

True concurrency

A

If the number of executing threads exceeds the number of processors, true concurrency cannot be achieved. Instead, the scheduler switches rapidly between the threads to be processed, so that every code path is split into execution intervals that are processed in sequence.

9
Q

Blocked threads

A

Blocked threads are suspended while they wait for the monitor to be released by another thread.

10
Q

Executing thread

A

The executing thread is the one and only thread that owns the monitor and is currently running the code in the critical section.

11
Q

Waiting thread

A

Waiting threads are threads that have voluntarily given up ownership of the monitor before they have reached the end of the critical section. They are waiting to be signalled before they can take ownership again. There is no FIFO guarantee on the order in which waiting threads are signalled.
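
A minimal wait/notify sketch, assuming a simple flag-based handshake (the class, field, and method names are illustrative):

public class WaitNotifyDemo {
    private final Object monitor = new Object();
    private boolean ready = false;

    // Gives up the monitor inside wait() until another thread signals it.
    public void awaitReady() throws InterruptedException {
        synchronized (monitor) {
            while (!ready) {        // guard against spurious wakeups
                monitor.wait();     // releases the monitor while waiting
            }
        }
    }

    // Signals one waiting thread; there is no FIFO guarantee on which one wakes up.
    public void markReady() {
        synchronized (monitor) {
            ready = true;
            monitor.notify();
        }
    }
}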

12
Q

Synchronized keyword

A

The synchronized keyword can operate on different intrinsic locks. Keep in mind that synchronization on a static method operates on the intrinsic lock of the class object, not on the instance object.
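
A sketch contrasting the two intrinsic locks (the class, fields, and method names are illustrative):

public class SynchronizedCounter {
    private static int classCount = 0;
    private int instanceCount = 0;

    // Locks on the SynchronizedCounter.class object, shared by all instances.
    public static synchronized void incrementClassCount() {
        classCount++;
    }

    // Locks on the intrinsic lock of this particular instance.
    public synchronized void incrementInstanceCount() {
        instanceCount++;
    }

    // Equivalent to the instance method above, written as a synchronized block.
    public void incrementInstanceCountExplicit() {
        synchronized (this) {
            instanceCount++;
        }
    }
}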

13
Q

ReentrantLock

A

ReentrantLock is similar to the synchronized keyword but provides more functionality. For example, it can be made approximately FIFO by enabling its fairness property. It also provides a tryLock() method that acquires the lock only if it is not held by another thread, and a lockInterruptibly() method that allows a thread to be interrupted while it is waiting for the lock.
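
A minimal sketch of these features on a hypothetical guarded counter (class, field, and method names are illustrative):

import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class ReentrantLockDemo {
    // true enables the fairness policy: waiting threads acquire the lock roughly in FIFO order.
    private final ReentrantLock lock = new ReentrantLock(true);
    private int counter = 0;

    public boolean tryIncrement() throws InterruptedException {
        // tryLock attempts to acquire the lock without blocking indefinitely.
        if (lock.tryLock(100, TimeUnit.MILLISECONDS)) {
            try {
                counter++;
                return true;
            } finally {
                lock.unlock();   // always release in a finally block
            }
        }
        return false;            // the lock was held by another thread
    }

    public void incrementInterruptibly() throws InterruptedException {
        lock.lockInterruptibly(); // the waiting thread can be interrupted here
        try {
            counter++;
        } finally {
            lock.unlock();
        }
    }
}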

14
Q

ReentrantReadWriteLock

A

ReentrantReadWriteLock allows reading and writing to be locked independently. It adds overhead for checking whether a thread may enter the locked section, so it is only advisable when many threads are reading while few threads are writing.
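
A sketch of a read-mostly cache guarded by a ReentrantReadWriteLock (the class and the backing map are illustrative):

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class ReadMostlyCache {
    private final Map<String, String> map = new HashMap<>();
    private final ReentrantReadWriteLock rwLock = new ReentrantReadWriteLock();

    // Many readers can hold the read lock at the same time.
    public String get(String key) {
        rwLock.readLock().lock();
        try {
            return map.get(key);
        } finally {
            rwLock.readLock().unlock();
        }
    }

    // The write lock is exclusive: no readers or other writers while it is held.
    public void put(String key, String value) {
        rwLock.writeLock().lock();
        try {
            map.put(key, value);
        } finally {
            rwLock.writeLock().unlock();
        }
    }
}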

15
Q

Threads disadvantage

A

A disadvantage of threads is that they consume additional memory and incur overhead when they are started.

16
Q

Concurrent design best practices

A
  • Favor reuse of threads instead of always creating new threads, so that the frequency of creation and teardown of resources is reduced (see the thread-pool sketch below).
  • Do not use more threads than required: the more threads that are used, the more memory and processor time is consumed, and the more context-switching overhead is incurred.
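
A minimal thread-pool sketch along these lines, using a fixed-size ExecutorService (the pool size and tasks are illustrative):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // A fixed pool reuses a bounded number of threads instead of creating one per task.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 10; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    "Task " + taskId + " on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                             // stop accepting new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS); // wait for submitted tasks to finish
    }
}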