L5 - Memory Hierarchy Flashcards

1
Q

Write Through

A

Update main memory immediately on every write (as well as the cache). Execution is short and simple, but energy consumption is higher due to the larger number of memory writes.
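
Not from the lecture, just a minimal C sketch of the idea (all structures, sizes and names are assumptions): on a store, the cached copy (if present) and main memory are updated at the same time.

```c
#include <stdint.h>

/* Illustrative toy model; sizes and names are assumptions, not the lecture's code. */
#define NUM_LINES 64
#define LINE_SIZE 32

static uint8_t cache_data[NUM_LINES][LINE_SIZE];
static uint8_t main_memory[1u << 20];               /* 1 MiB backing memory */

/* Write-through: every store updates the cache line (on a hit) AND main memory. */
void write_through_store(uint32_t addr, uint8_t value, int hit, uint32_t line)
{
    if (hit)
        cache_data[line][addr % LINE_SIZE] = value; /* keep the cached copy up to date */

    main_memory[addr % (1u << 20)] = value;         /* always update memory immediately */
}
```
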

2
Q

Spatial Locality

A

Spatial Locality describes the concept that data stored close together in memory (neighbouring addresses) tends to be accessed close together in time, which is why caches load whole lines rather than single words.
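
As an illustration (my own example): iterating over a C array in row-major order touches neighbouring addresses, so each fetched cache line is fully used; column-major traversal of the same array wastes most of every line.

```c
#include <stddef.h>

#define N 1024

/* Good spatial locality: consecutive iterations touch neighbouring addresses. */
long sum_row_major(const int a[N][N])
{
    long sum = 0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            sum += a[i][j];
    return sum;
}

/* Poor spatial locality: consecutive iterations jump N * sizeof(int) bytes. */
long sum_col_major(const int a[N][N])
{
    long sum = 0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            sum += a[i][j];
    return sum;
}
```
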

3
Q

Direct Mapping

A

Each memory address maps to exactly one particular cache line. DM is simple to implement but does not utilise cache resources efficiently: addresses that map to the same line keep evicting each other, leading to a high (conflict) miss rate.
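
A small sketch of the address split a direct-mapped cache uses (the line count and block size are arbitrary example values, not the lecture's).

```c
#include <stdint.h>
#include <stdio.h>

#define LINE_SIZE 32u   /* example block size (assumption)      */
#define NUM_LINES 64u   /* example number of lines (assumption) */

int main(void)
{
    uint32_t addr = 0x1234ABCDu;

    uint32_t offset = addr % LINE_SIZE;               /* byte within the block            */
    uint32_t index  = (addr / LINE_SIZE) % NUM_LINES; /* the ONE line this block may use  */
    uint32_t tag    = addr / (LINE_SIZE * NUM_LINES); /* identifies which block is cached */

    /* Two addresses with the same index but different tags evict each other:
     * this is the conflict-miss problem of direct mapping. */
    printf("offset=%u index=%u tag=0x%X\n", offset, index, tag);
    return 0;
}
```
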

4
Q

What does Temporal Locality describe?

A

Temporal Locality describes the concept that data which has been accessed recently is likely to be accessed again soon.
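
Illustrative example (mine, not the lecture's): 'sum' and 'i' are touched on every iteration, so once loaded they are hit again and again shortly afterwards.

```c
/* Temporal locality: the accumulator and counter are re-used on every iteration. */
long sum_array(const int *a, int n)
{
    long sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i];
    return sum;
}
```
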

5
Q

Memory Hierarchy

A

Cache (SRAM) → Main Memory (DRAM) → Virtual Memory (SSD/HDD): speed decreases and capacity increases down the hierarchy.

6
Q

What does SDRAM stand for?

A

SDRAM (Synchronous Dynamic Random Access Memory) provides large data storage at low cost and is accessed when the cache misses. Data is stored in banks.

  • PRO: large data storage at low cost and small chip area; fast; used in advanced safety applications
  • CON: difficult for worst-case timing analysis

7
Q

LRU

A

Least Recently Used. If all cache lines are valid, the line that was least recently used (highest LRU counter) gets overwritten.
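
A minimal sketch of one common counter-based LRU implementation (my assumption, not necessarily the lecture's): each line keeps an age counter, a hit resets that line's counter and ages the younger lines, and a miss replaces the line with the highest counter.

```c
#include <stdint.h>

#define NUM_LINES 4   /* example size (assumption) */

static uint32_t tag[NUM_LINES];
static int      valid[NUM_LINES];
static uint32_t age[NUM_LINES];   /* higher age = less recently used */

/* Returns the line index that now holds 'new_tag'. */
int access_lru(uint32_t new_tag)
{
    int victim = 0;

    /* Hit: age every line that was younger, then reset this line's age. */
    for (int i = 0; i < NUM_LINES; i++) {
        if (valid[i] && tag[i] == new_tag) {
            for (int j = 0; j < NUM_LINES; j++)
                if (valid[j] && age[j] < age[i])
                    age[j]++;
            age[i] = 0;
            return i;
        }
    }

    /* Miss: victim is an invalid line, or else the one with the highest age. */
    for (int i = 0; i < NUM_LINES; i++) {
        if (!valid[i]) { victim = i; break; }
        if (age[i] > age[victim]) victim = i;
    }

    for (int i = 0; i < NUM_LINES; i++)   /* every resident line gets one access older */
        if (valid[i]) age[i]++;

    tag[victim] = new_tag;
    valid[victim] = 1;
    age[victim] = 0;
    return victim;
}
```
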

8
Q

Fully Associative with Round Robin/FIFO

A

The initially empty cache gets filled in a Round Robin manner.

  • PRO: simple and needs less chip area; roughly mimics LRU (better performance than direct mapped)
  • CON: slightly more complicated to predict; slightly lower performance than LRU
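
A sketch of the round-robin idea (structures are assumptions): a single pointer selects the next victim and simply wraps around, which is why it needs much less hardware than full LRU.

```c
#include <stdint.h>

#define NUM_LINES 4                 /* example size (assumption) */

static uint32_t tag[NUM_LINES];
static int      valid[NUM_LINES];
static int      next_victim = 0;    /* one counter replaces LRU's per-line bookkeeping */

/* Returns the line index that now holds 'new_tag'. */
int access_round_robin(uint32_t new_tag)
{
    for (int i = 0; i < NUM_LINES; i++)          /* hit: nothing to update */
        if (valid[i] && tag[i] == new_tag)
            return i;

    int line = next_victim;                      /* miss: overwrite the next line in order */
    tag[line] = new_tag;
    valid[line] = 1;
    next_victim = (next_victim + 1) % NUM_LINES; /* wrap around */
    return line;
}
```
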
9
Q

Fully Associative with LRU (Least Recently Used)

A

In the initially empty cache, each memory address can be associated with any cache line.

  • PRO: conflict misses are avoided, leading to higher utilization and good performance for most applications.
  • CON: most complex architecture; large hardware overhead for LRU due to counters and comparators.

10
Q

Set Associative Cache

A

Hybrid of fully associative and direct mapping: cache lines are grouped into sets; an address maps to exactly one set, but the data can be stored in any cache line within that set.
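
A lookup sketch for a 2-way set-associative cache (all parameters are example assumptions): the index picks a set exactly as in direct mapping, and the tag is then compared against every way in that set, as in a fully associative cache.

```c
#include <stdint.h>

#define LINE_SIZE 32u
#define NUM_SETS  16u
#define NUM_WAYS  2u     /* 2-way set associative; all values are example assumptions */

static uint32_t tag[NUM_SETS][NUM_WAYS];
static int      valid[NUM_SETS][NUM_WAYS];

/* Returns 1 on hit, 0 on miss. */
int lookup(uint32_t addr)
{
    uint32_t set = (addr / LINE_SIZE) % NUM_SETS;   /* direct-mapped part: pick the set   */
    uint32_t t   = addr / (LINE_SIZE * NUM_SETS);   /* tag identifies the block           */

    for (uint32_t way = 0; way < NUM_WAYS; way++)   /* associative part: any way may hold it */
        if (valid[set][way] && tag[set][way] == t)
            return 1;
    return 0;
}
```
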

11
Q

Define: Miss Penalty.

A

The time needed to fetch the requested block from the next memory level after a cache miss, i.e. essentially the main-memory access time.
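
To see where the miss penalty enters, a hedged worked example with made-up numbers (not from the lecture): the average memory access time is the hit time plus the fraction of accesses that pay the miss penalty.

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative, assumed values: 1-cycle cache hit, 5% miss rate,
     * 100-cycle main-memory access (the miss penalty). */
    double hit_time     = 1.0;
    double miss_rate    = 0.05;
    double miss_penalty = 100.0;

    /* AMAT = hit time + miss rate * miss penalty */
    double amat = hit_time + miss_rate * miss_penalty;

    printf("average memory access time = %.1f cycles\n", amat);  /* prints 6.0 */
    return 0;
}
```
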

12
Q

Write Back

A

Writes go to the cache only; a modified (dirty) line is written to memory when it is replaced. Fewer memory writes (energy efficient) and short execution time, but more difficult to implement (dirty bit and comparators needed).
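
A minimal write-back sketch (structures and sizes are assumptions): stores only touch the cache and set a dirty bit; memory is updated once, when the dirty line is evicted.

```c
#include <stdint.h>

#define NUM_LINES 64
#define LINE_SIZE 32

static uint8_t  cache_data[NUM_LINES][LINE_SIZE];
static uint32_t cache_tag[NUM_LINES];
static int      valid[NUM_LINES];
static int      dirty[NUM_LINES];                 /* marks lines memory does not yet reflect */
static uint8_t  main_memory[1u << 20];            /* toy 1 MiB backing memory */

/* Write hits only update the cache and mark the line dirty. */
void write_back_store(uint32_t line, uint32_t offset, uint8_t value)
{
    cache_data[line][offset] = value;
    dirty[line] = 1;                              /* memory is now stale for this line */
}

/* Only when a dirty line is evicted is its content written to memory, once. */
void evict(uint32_t line)
{
    if (valid[line] && dirty[line]) {
        uint32_t base = (cache_tag[line] * NUM_LINES + line) * LINE_SIZE;
        for (uint32_t i = 0; i < LINE_SIZE; i++)
            main_memory[(base + i) % (1u << 20)] = cache_data[line][i];
        dirty[line] = 0;
    }
    valid[line] = 0;
}
```
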
