Memory and Cache Flashcards
What is Fully Associative Cache?
A fully associative cache can place a block from any memory address into any block of the cache
What is Set Associative Cache?
A trade-off between the direct-mapped cache and the fully associative cache: the cache is divided into groups of blocks called sets, each memory address maps to exactly one set, and the data can be placed in any block within that set
How does a Set Associative Cache differ from both Direct Mapped and Fully Associative Caches?
A set-associative cache divides the cache into sets; each address maps to exactly one set but can be placed in any block within that set, balancing the flexibility of a fully associative cache with the simplicity of a direct-mapped one
What are the trade-offs of using a Fully Associative Cache compared to Direct Mapping?
A fully associative cache allows data to be stored in any block, increasing flexibility and reducing conflict misses at the cost of greater complexity and higher hardware requirements, since every block's tag must be compared on each access
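The sketch below is a minimal illustration (with made-up geometry: 4 ways, 64 sets, 64-byte blocks, none of which come from the cards) of how a set-associative lookup splits an address into a set index and a tag, then searches every way in the selected set; a fully associative cache is the extreme case of a single set containing all blocks.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical geometry: 4-way set associative, 64-byte blocks, 64 sets. */
#define BLOCK_BYTES 64u
#define NUM_SETS    64u
#define NUM_WAYS    4u

typedef struct {
    bool     valid;
    uint32_t tag;
} cache_line_t;

static cache_line_t cache[NUM_SETS][NUM_WAYS];

/* Returns true on a hit: the address maps to one set,
 * but the block may live in any way of that set. */
bool lookup(uint32_t addr)
{
    uint32_t block_addr = addr / BLOCK_BYTES;     /* drop the byte offset within the block */
    uint32_t set        = block_addr % NUM_SETS;  /* each address maps to exactly one set  */
    uint32_t tag        = block_addr / NUM_SETS;  /* remaining bits identify the block     */

    for (uint32_t way = 0; way < NUM_WAYS; way++) {
        if (cache[set][way].valid && cache[set][way].tag == tag)
            return true;                          /* hit: found in one of the set's ways   */
    }
    return false;                                 /* miss: block is not in this cache      */
}
```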
What is temporal locality, and how does it influence what should be stored in the cache?
Temporal locality suggests that recently accessed data is likely to be accessed again soon, so such data should be kept in the cache
What is direct mapping in cache architecture?
A method where each memory location maps to exactly one location in the cache
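As a small sketch (the 1,024-block, 32-byte-block geometry is assumed, not from the cards), direct mapping needs no search at all: the block address alone determines the single cache location to check.

```c
#include <stdint.h>

/* Hypothetical geometry: 1024 cache blocks of 32 bytes each. */
#define BLOCK_BYTES 32u
#define NUM_BLOCKS  1024u

/* Each memory block maps to exactly one cache block. */
uint32_t direct_mapped_index(uint32_t addr)
{
    uint32_t block_addr = addr / BLOCK_BYTES;  /* strip the byte offset                        */
    return block_addr % NUM_BLOCKS;            /* index = block address mod number of blocks   */
}
```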
Define the principle of locality in relation to cache performance
Locality refers to the tendency of a program to access a relatively small portion of its address space at any given time; it typically takes the form of temporal and spatial locality
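As a concrete illustration (the loop itself is hypothetical), the sketch below shows both kinds of locality in one function: the accumulator is reused every iteration, and the array elements are visited in memory order.

```c
/* Illustration of the two kinds of locality described above (array size is arbitrary). */
#define N 1024

int sum_array(const int a[N])
{
    int sum = 0;              /* 'sum' and 'i' are touched every iteration: temporal locality */
    for (int i = 0; i < N; i++) {
        sum += a[i];          /* a[0], a[1], ... are adjacent in memory: spatial locality     */
    }
    return sum;
}
```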
What is the difference in access times and costs between Dynamic RAM (DRAM) and Flash semiconductor memory?
DRAM: access time 50 ns to 70 ns, cost $10 to $20 per GB; Flash: access time 5,000 ns to 50,000 ns, cost $0.75 to $1 per GB
Explain how invalid blocks in a cache can be reused for new data
Invalid blocks (those whose valid bit is clear) hold no useful data, so new data can be written into them without evicting any valid block, making efficient use of cache space
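A minimal sketch of that idea (the 4-way set and the `choose_victim` helper are assumptions for illustration): on a miss, an invalid way in the set is filled first, and only if every way is valid does a replacement policy have to evict something.

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_WAYS 4u

typedef struct {
    bool     valid;
    uint32_t tag;
} cache_line_t;

/* On a miss, prefer filling an invalid way: no valid data has to be evicted. */
uint32_t choose_victim(const cache_line_t set[NUM_WAYS])
{
    for (uint32_t way = 0; way < NUM_WAYS; way++) {
        if (!set[way].valid)
            return way;      /* free slot: reuse it for the new block                          */
    }
    return 0;                /* no invalid way: fall back to a replacement policy (e.g. LRU)   */
}
```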
In what scenarios are larger block sizes beneficial for cache performance?
When a program accesses contiguous data items, larger blocks exploit spatial locality: one miss brings in many neighbouring items, reducing the number of subsequent cache misses
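A back-of-the-envelope sketch of that effect (the block sizes and 4-byte element size are assumed numbers, not from the cards): for a purely sequential scan, one miss per block means the miss rate falls as the block grows.

```c
#include <stdio.h>

/* Rough estimate (assumed numbers): how block size affects the miss rate
 * of a purely sequential scan of 4-byte integers. */
int main(void)
{
    const double elem_bytes    = 4.0;               /* sizeof(int) on most machines */
    const double block_sizes[] = {16, 32, 64, 128}; /* bytes                        */

    for (int i = 0; i < 4; i++) {
        double elems_per_block = block_sizes[i] / elem_bytes;
        /* One miss loads a whole block; the remaining accesses to that block hit. */
        printf("block = %3.0f bytes -> ~1 miss per %2.0f accesses (miss rate ~%.1f%%)\n",
               block_sizes[i], elems_per_block, 100.0 / elems_per_block);
    }
    return 0;
}
```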
What impact does miss penalty have on Average Memory Access Time (AMAT)?
A higher miss penalty increases AMAT, leading to slower overall performance because more time is spent fetching data from the next level of the memory hierarchy
What are the components that make up Average Memory Access Time (AMAT)? (Formula)
AMAT = hit time + (miss rate × miss penalty)
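A worked example of the formula with made-up numbers (a 1 ns hit time, 5% miss rate, and 100 ns miss penalty are assumptions, not values from the cards); it also shows how a larger miss penalty drives AMAT up, as the previous card states.

```c
#include <stdio.h>

/* AMAT = hit time + miss rate * miss penalty, evaluated with assumed numbers. */
int main(void)
{
    double hit_time_ns  = 1.0;    /* assumed L1 hit time       */
    double miss_rate    = 0.05;   /* assumed 5% miss rate      */
    double miss_penalty = 100.0;  /* assumed penalty in ns     */

    double amat = hit_time_ns + miss_rate * miss_penalty;
    printf("AMAT = %.1f ns + %.2f * %.1f ns = %.1f ns\n",
           hit_time_ns, miss_rate, miss_penalty, amat);
    /* With these numbers AMAT is 6 ns; doubling the miss penalty to 200 ns
     * raises it to 11 ns, illustrating the miss-penalty card above. */
    return 0;
}
```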
How do you place a new value in a cache?
By using a placement (mapping) policy such as direct mapping, set associative, or fully associative mapping, which determines which cache block(s) the new value may occupy
What factors can be used to evaluate the performance of cache?
- Hit rate and miss rate
- Hit time
- Miss penalty
- Average Memory Access Time (AMAT)
What are the access times and costs associated with Static RAM (SRAM)?
SRAM: access time 0.5 ns to 2.5 ns, cost $500 to $1,000 per GB