MEMORY II Flashcards
What are the basic design elements used to classify cache architectures
- Cache size
- Mapping function
- Replacement algorithm
- Write policy
- Line size
- Number of caches
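To make the mapping-function and line-size elements concrete, here is a minimal sketch in C, assuming (for illustration only; these figures are not from the card) a 16 KB direct-mapped cache with 64-byte lines. It shows how a mapping function splits a memory address into tag, index, and offset fields.

```c
#include <stdio.h>

/* Assumed parameters, chosen only for illustration:
 * a 16 KB direct-mapped cache with 64-byte lines (256 lines). */
#define LINE_SIZE  64u
#define CACHE_SIZE (16u * 1024u)
#define NUM_LINES  (CACHE_SIZE / LINE_SIZE)

int main(void) {
    unsigned addr = 0x0040A3C4u;  /* example memory address */

    /* The mapping function derives three fields from the address: */
    unsigned offset = addr % LINE_SIZE;               /* byte within the line */
    unsigned index  = (addr / LINE_SIZE) % NUM_LINES; /* which cache line     */
    unsigned tag    = addr / (LINE_SIZE * NUM_LINES); /* identifies the block */

    printf("addr=0x%08X -> tag=0x%X, index=%u, offset=%u\n",
           addr, tag, index, offset);
    return 0;
}
```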
Describe multilevel caches
Due to increased logic density, it has become possible to have a cache on the same chip as the processor. This reduces execution time, as less activity over an external bus is needed. Even though an on-chip cache exists, it is typically desirable to have an off-chip cache as well. This means that if a miss occurs on the level 1 cache (on-chip), instead of retrieving the data from the slower main memory, the information may be retrieved from the level 2 cache, which, although slower than the level 1 cache, is still appreciably faster than main memory. In some designs the level 2 cache is also placed on-chip, and a level 3 cache is implemented off-chip.
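A quick way to see the benefit is to compute the average memory access time with and without a level 2 cache. The sketch below uses assumed, illustrative latencies and hit rates (1 cycle and 90% for L1, 10 cycles and 95% for L2, 100 cycles for main memory); none of these figures come from the card.

```c
#include <stdio.h>

int main(void) {
    /* Assumed, illustrative figures (cycles and hit rates). */
    double l1_time = 1.0,  l1_hit = 0.90;
    double l2_time = 10.0, l2_hit = 0.95;  /* hit rate among accesses that miss L1 */
    double mem_time = 100.0;

    /* Average access time with only an L1 cache in front of main memory. */
    double amat_l1_only = l1_time + (1.0 - l1_hit) * mem_time;

    /* With an L2 cache, L1 misses go to L2; only L2 misses reach main memory. */
    double amat_l1_l2 = l1_time +
        (1.0 - l1_hit) * (l2_time + (1.0 - l2_hit) * mem_time);

    printf("L1 only: %.2f cycles\n", amat_l1_only); /* 11.00 */
    printf("L1 + L2: %.2f cycles\n", amat_l1_l2);   /*  2.50 */
    return 0;
}
```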
Describe unified/split caches
Two types of words are stored in a cache: data and instructions. It has become common to split the cache into two, with one cache dedicated to each type.
What are the advantages of split and unified caches?
Two potential advantages to a unified cache are:
- A greater hit rate than split caches, because the load between instruction and data fetches is balanced automatically.
- Only one cache needs to be designed and implemented.
The key advantage of the split cache design is that it eliminates contention for the cache between the instruction fetch/decode unit and the execution unit. This is important for designs that rely on pipelining of instructions.
STATE and EXPLAIN any TWO variations of RAM
Dynamic RAM (DRAM): stores each bit as a charge on a capacitor; because the charge leaks away, the cells must be refreshed periodically. DRAM is denser and cheaper per bit and is typically used for main memory.
Static RAM (SRAM): stores each bit in a flip-flop (latch) arrangement of transistors, so it holds its data as long as power is supplied and needs no refreshing. SRAM is faster but more expensive and is typically used for cache memory.
What does it mean to say that the memory is 133 MHz and the processor is 1.4 GHz?
Saying the processor is 1.4 GHz means it is clocked at 1,400,000,000 cycles per second. Saying the memory is 133 MHz means the memory can be accessed at 133,000,000 cycles per second.
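As a quick check of the arithmetic, the sketch below converts both ratings to cycles per second and computes how many processor cycles elapse per memory cycle (the ratio itself is an added illustration, not part of the card).

```c
#include <stdio.h>

int main(void) {
    double cpu_hz = 1.4e9;  /* 1.4 GHz = 1,400,000,000 cycles per second */
    double mem_hz = 133e6;  /* 133 MHz =   133,000,000 cycles per second */

    /* Roughly how many processor cycles pass for each memory cycle;
     * this speed gap is what cache memory is meant to bridge. */
    printf("Processor cycles per memory cycle: %.1f\n", cpu_hz / mem_hz); /* ~10.5 */
    return 0;
}
```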
Why do we have two separate cache memories (instruction/code cache and data cache) in a processor?
The split design enables us to place the instruction cache close to the instruction fetch unit and the data cache close to the memory unit, thereby simultaneously reducing the latencies of both.
What are three differences between caches and virtual memory?
Objective - cache memory increases CPU access speed, while virtual memory increases the effective capacity of main memory.
Management - the CPU and related hardware manage cache memory, while the operating system manages virtual memory.
Operation - cache memory keeps recently used data, while virtual memory holds programs (or parts of programs) that cannot be accommodated in main memory.
Mapping - virtual memory needs a mapping framework to translate virtual addresses into physical addresses, whereas cache memory needs no such address translation; its mapping is handled entirely in hardware, transparently to software.
LIST & EXPLAIN any THREE similarities between L1 & L2 cache
- Both hold copies of recently used instructions and data so that the processor can avoid slower main-memory accesses.
- Both rely on the principle of locality of reference: recently referenced words and their neighbours are likely to be referenced again soon.
- Both are built from fast SRAM and are managed entirely in hardware, transparently to the programmer and the operating system.