Transactional Memory Flashcards

1
Q

What does the programmer decide with Transactional Memory?

A

Programmer defines atomic code sections
Programmer only specifies what operations should be atomic, but not how (the how is left to the system, either hardware or software)

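A minimal sketch of such an atomic section, assuming GCC's experimental transactional-memory extension (built with gcc -fgnu-tm, which links against libitm); the names are illustrative:

```c
/* Sketch (assumption): GCC's experimental TM extension, built with
 *   gcc -fgnu-tm tm_counter.c
 * The programmer only marks WHAT must be atomic; HOW it is made
 * atomic is left to the TM runtime. */
#include <stdio.h>

static long shared_counter = 0;

void deposit(long amount)
{
    __transaction_atomic {      /* atomic section, no explicit lock */
        shared_counter += amount;
    }
}

int main(void)
{
    deposit(5);
    printf("%ld\n", shared_counter);
    return 0;
}
```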
2
Q

Benefits of Transactional Memory?

A
  • simpler and less error-prone code
  • higher-level semantics (what vs. how)
  • optimistic by design (does not require mutual exclusion)
3
Q

Goal of Transactional Memory?

A

Remove the burden of synchronization from the programmer and place it in the system

4
Q

TM semantics: Atomicity

A

Other threads see either the initial or the final state, but never any intermediate state

state changes are made visible atomically

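A sketch under the same -fgnu-tm assumption as above: a transfer touching two variables, where other threads can never observe the half-done intermediate state:

```c
#include <stdio.h>

static long a = 100, b = 0;

void transfer(long amount)
{
    __transaction_atomic {
        a -= amount;    /* other threads never see this half-done state */
        b += amount;    /* a and b become visible together, atomically  */
    }
}

int main(void)
{
    transfer(40);
    printf("a=%ld b=%ld\n", a, b);
    return 0;
}
```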
5
Q

TM semantics: Isolation

A
  • while a transaction is running, effects from other transactions are not observed
  • as if the transaction takes a snapshot of the global state when it begins and then operates on that snapshot
6
Q

What happens in this example? (See picture: the right code is committed before the left code.)

A

It becomes clear when you serialize the executions of TXb and TXa (see picture). Issues like this are handled by a concurrency control mechanism.

The transaction aborts; it can then be retried automatically, or the user can be informed.

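Not the lecture's mechanism, but a miniature analogue of abort-and-retry using a C11 compare-and-swap: if another thread committed first, the update "aborts" and is retried on the new state:

```c
#include <stdatomic.h>
#include <stdio.h>

static _Atomic int account = 100;

void withdraw(int amount)
{
    int old = atomic_load(&account);   /* take a "snapshot" */
    int desired;
    do {
        desired = old - amount;        /* compute tentative new state */
        /* CAS fails ("abort") if another thread changed the account;
         * 'old' is then refreshed and the update is retried. */
    } while (!atomic_compare_exchange_weak(&account, &old, desired));
}

int main(void)
{
    withdraw(30);
    printf("%d\n", atomic_load(&account));
    return 0;
}
```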
7
Q

What is the difference between synchronous and asynchronous messages?

A

Synchronous: the sender blocks until the message is received.

Asynchronous: the sender does not block; the message is placed into a buffer for the receiver to retrieve.

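A sketch in MPI terms, assuming two ranks: MPI_Ssend for the synchronous case, and MPI_Isend as a non-blocking send that returns immediately (completion is checked later with MPI_Wait):

```c
/* Assumed build/run: mpicc msg.c && mpirun -np 2 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 42;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Synchronous: returns only after the receiver has started
         * receiving the message. */
        MPI_Ssend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);

        /* Non-blocking: returns immediately; MPI delivers the
         * message in the background. */
        MPI_Request req;
        MPI_Isend(&value, 1, MPI_INT, 1, 1, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);   /* completion check */
    } else if (rank == 1) {
        int a, b;
        MPI_Recv(&a, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Recv(&b, 1, MPI_INT, 0, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("received %d and %d\n", a, b);
    }
    MPI_Finalize();
    return 0;
}
```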
8
Q

What is the Message Passing Interface (MPI)

A

It is a portable and flexible library that is used as the standard API for message-passing parallel programming.

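A minimal example (assuming an MPI implementation such as Open MPI or MPICH and the usual mpicc/mpirun tooling):

```c
/* Assumed build/run: mpicc hello.c && mpirun -np 4 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);                 /* start the MPI runtime   */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* my id within the group  */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* number of processes     */
    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();                         /* shut the runtime down   */
    return 0;
}
```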
9
Q

MPI processes can be collected into groups. How are groups distinguished and how can they be combined to create something new?

A

Each group can have multiple colors.

Group + color = communicator.

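A sketch of how this looks in MPI, using MPI_Comm_split to turn MPI_COMM_WORLD plus a color into a new communicator (the even/odd split is just an example):

```c
/* Assumed build/run: mpicc split.c && mpirun -np 4 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int world_rank, sub_rank;
    MPI_Comm sub_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    int color = world_rank % 2;             /* 0 = even ranks, 1 = odd */
    MPI_Comm_split(MPI_COMM_WORLD, color, world_rank, &sub_comm);

    MPI_Comm_rank(sub_comm, &sub_rank);     /* rank inside the new communicator */
    printf("world rank %d -> color %d, sub rank %d\n",
           world_rank, color, sub_rank);

    MPI_Comm_free(&sub_comm);
    MPI_Finalize();
    return 0;
}
```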
10
Q

Transactional Memory is heavily inspired by database transactions with the ACID properties:

Give the 4 properties

A
  • Atomicity
  • Consistency (database remains in a consistent state)
  • Isolation (no mutual corruption of data)
  • Durability (e.g. transaction effects will survive power loss, not important in transactional memory)
11
Q

MPI also supports collective communication among groups of processes.

Give the operations and what they do. There are 5.

A

Reduce: Combines the values from all processes (e.g., their sum) and puts the result into one root process, e.g., process 0 (also called rank 0).

Broadcast: Sends a value from one process (e.g., rank 0) to all processes.

Allreduce: Sends the combined result (e.g., the sum of all values) to all processes. In the end, every process has the sum of the values of the individual processes.

Scatter: P0 has multiple values (say n values); they are split across n different processes.

Gather: The opposite of Scatter; collects all values back into one process.

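A sketch that exercises all five collectives (assuming it is run with 4 ranks; the sum is just one possible reduction operation):

```c
/* Assumed build/run: mpicc coll.c && mpirun -np 4 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int value = rank + 1, sum = 0, allsum = 0;

    /* Reduce: combine all values (here: sum) into rank 0 */
    MPI_Reduce(&value, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    /* Broadcast: send rank 0's result to every process */
    MPI_Bcast(&sum, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* Allreduce: like Reduce, but every process gets the sum */
    MPI_Allreduce(&value, &allsum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    /* Scatter: rank 0 distributes one element to each process;
     * Gather: collect one element from each process back on rank 0 */
    int sendbuf[4] = {10, 20, 30, 40}, piece = 0, collected[4];
    MPI_Scatter(sendbuf, 1, MPI_INT, &piece, 1, MPI_INT, 0, MPI_COMM_WORLD);
    MPI_Gather(&piece, 1, MPI_INT, collected, 1, MPI_INT, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum=%d allsum=%d piece=%d\n", sum, allsum, piece);

    MPI_Finalize();
    return 0;
}
```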
12
Q

Worst-case running time of Heapsort & Mergesort?

A

O(n log n)

13
Q

Circle the comparisons which can be performed in parallel in the sorting network in the picture.

A

1) (3, 4), (1, 2)
2) (1, 3), (2, 4)
3) (2, 3)

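A small sketch of how those stages map to compare-exchange steps (assuming the 1-based wire numbering from the picture; the array indices below are 0-based). Comparators within the same stage touch disjoint wires, so they could run in parallel:

```c
#include <stdio.h>

static void compare_exchange(int a[], int i, int j)
{
    if (a[i] > a[j]) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}

int main(void)
{
    int a[4] = {7, 3, 9, 1};

    /* stage 1: (1,2) and (3,4) in parallel */
    compare_exchange(a, 0, 1);
    compare_exchange(a, 2, 3);
    /* stage 2: (1,3) and (2,4) in parallel */
    compare_exchange(a, 0, 2);
    compare_exchange(a, 1, 3);
    /* stage 3: (2,3) */
    compare_exchange(a, 1, 2);

    printf("%d %d %d %d\n", a[0], a[1], a[2], a[3]);
    return 0;
}
```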
14
Q

Does pointer tagging solve the ABA problem for all cases? Justify your answer.

A

No. Pointer tagging only makes the ABA problem much less probable. It uses a few unused bits of an address as a tag and increments that tag each time the pointer is stored in the data structure. However, the tag can overflow and wrap around to a previous value, so ABA can still occur.

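A small illustrative sketch of the idea (not from the lecture): a version tag packed into the unused low bits of an aligned pointer, showing how few bits the tag gets and hence why it can wrap around:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdalign.h>

#define TAG_BITS  4
#define TAG_MASK  ((uintptr_t)((1u << TAG_BITS) - 1))

typedef struct Node { int value; } Node;

/* pack pointer + tag into one word (assumes 16-byte alignment) */
static uintptr_t pack(Node *p, unsigned tag)
{
    return (uintptr_t)p | (tag & TAG_MASK);
}

static Node *unpack_ptr(uintptr_t w)    { return (Node *)(w & ~TAG_MASK); }
static unsigned unpack_tag(uintptr_t w) { return (unsigned)(w & TAG_MASK); }

int main(void)
{
    static alignas(16) Node n = { 42 };

    uintptr_t head = pack(&n, 0);
    /* each time the pointer is stored again, bump the tag ... */
    head = pack(unpack_ptr(head), unpack_tag(head) + 1);
    /* ... but after 2^TAG_BITS stores the tag wraps back to 0,
     * so an old and a new value can look identical (ABA). */
    printf("value=%d tag=%u\n", unpack_ptr(head)->value, unpack_tag(head));
    return 0;
}
```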
15
Q

In the lecture we saw two important concepts that form the basis of message passing: Communicating Sequential Processes (CSP) and the Actor Model. What is the main difference between the two? Briefly describe a similarity MPI has with each of these concepts.

A

The Actor Model uses direct naming to address other computational units, whereas CSP uses indirect naming (ports/channels).

MPI is an implementation of message passing and uses ideas from both concepts. The communicators in MPI correspond to indirect naming in CSP. The idea of passing messages between computational units originated with the Actor Model and is also present in MPI.
