Lecture 3 Flashcards
Why is a memory hierarchy necessary in computer systems?
To balance cost and performance: no single memory is both large and fast, so data is stored and retrieved through a combination of small, fast memories close to the CPU and larger, slower ones that provide capacity.
What are the three main levels of memory hierarchy and their characteristics?
Primary (fast, solid-state), Secondary (larger, slower, hard disks), Tertiary (tapes for backup).
Define cycle time, latency, and bandwidth in the context of memory systems.
Cycle time: Minimum time from the start of one memory access to the start of the next.
Latency: Time from issuing a request until the data is available.
Bandwidth: Rate at which data can be transferred, in bits per second.
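A quick worked sketch relating these quantities, using made-up numbers (the 64-bit word width and 10 ns cycle time are assumptions for illustration only):

```python
# Illustrative numbers: a memory module that delivers one 64-bit word
# per access and can start a new access every 10 ns.
word_width_bits = 64
cycle_time_s = 10e-9             # time from one access to the next

bandwidth_bits_per_s = word_width_bits / cycle_time_s
print(bandwidth_bits_per_s)      # 6.4e9 bits/s, i.e. 6.4 Gbit/s
```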
What is the main idea behind cache memory?
To keep frequently used information close to the CPU in a small, fast memory to improve access times.
What happens during a cache hit and a cache miss?
Cache hit: The requested data is found in the cache and returned immediately.
Cache miss: The data is not in the cache; the containing block (and possibly neighboring blocks) must be fetched from main memory into the cache.
Explain direct mapping in cache memory.
Each main memory block maps to exactly one cache block via a fixed relation (typically the block number modulo the number of cache blocks). Placement is simple, but blocks that share the same slot compete for it, which can lower the hit ratio.
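A minimal sketch of a direct-mapped lookup, assuming a hypothetical cache of 8 blocks where the index is the block number mod 8 and the rest of the block number is kept as the tag:

```python
NUM_BLOCKS = 8                       # assumed cache size, in blocks
cache = [None] * NUM_BLOCKS          # each slot stores the tag of the resident block

def direct_mapped_access(block_number):
    index = block_number % NUM_BLOCKS    # fixed relation: exactly one possible slot
    tag = block_number // NUM_BLOCKS
    if cache[index] == tag:
        return "hit"
    cache[index] = tag                   # miss: fetch the block, evicting the old one
    return "miss"

print(direct_mapped_access(3))    # miss (cold cache)
print(direct_mapped_access(3))    # hit
print(direct_mapped_access(11))   # miss: 11 % 8 == 3, so it evicts block 3
```

The last line shows the competition the flashcard mentions: blocks 3 and 11 can never be resident at the same time, no matter how much of the cache is empty.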
How does fully associative mapping differ from direct mapping?
An incoming block can be placed in any cache block, giving maximum placement flexibility but requiring every stored tag to be compared on each access and a replacement policy to choose a victim when the cache is full.
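A corresponding sketch for a fully associative lookup, under the same assumed 8-block cache (FIFO replacement is chosen purely for illustration):

```python
NUM_BLOCKS = 8
cache = []                             # block numbers currently resident

def fully_associative_access(block_number):
    if block_number in cache:          # hardware compares against every tag in parallel
        return "hit"
    if len(cache) == NUM_BLOCKS:
        cache.pop(0)                   # simple FIFO replacement for illustration
    cache.append(block_number)         # any free slot will do
    return "miss"
```

The extra management cost comes from that full search: real hardware needs one comparator per block to do it in parallel.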
Describe set-associative mapping.
The cache is divided into sets; each main memory block maps to exactly one set but may occupy any block within that set. This is a compromise between direct and fully associative mapping.
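A sketch of a 2-way set-associative lookup, assuming the same 8 blocks organized as 4 sets of 2 (all sizes are assumptions for illustration):

```python
NUM_SETS = 4                           # assumed: 8 blocks as 4 sets of 2 ways
WAYS = 2
sets = [[] for _ in range(NUM_SETS)]   # each set holds up to WAYS resident tags

def set_associative_access(block_number):
    set_index = block_number % NUM_SETS     # block maps to exactly one set...
    tag = block_number // NUM_SETS
    ways = sets[set_index]
    if tag in ways:                         # ...but may sit in any way of that set
        return "hit"
    if len(ways) == WAYS:
        ways.pop(0)                         # FIFO replacement within the set
    ways.append(tag)
    return "miss"
```

Compared with the direct-mapped sketch above, two blocks that map to the same set can now coexist, which is exactly the compromise the flashcard describes.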
What is memory interleaving and its purpose?
A technique to increase bandwidth by distributing consecutive memory words across multiple memory modules so their accesses can overlap in parallel, allowing faster block transfers.
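A sketch of low-order interleaving across a hypothetical 4-module memory: consecutive addresses land in different modules, so a block transfer can keep all modules busy at once.

```python
NUM_MODULES = 4                        # assumed number of memory modules (banks)

def locate(address):
    module = address % NUM_MODULES         # low-order bits select the module
    offset = address // NUM_MODULES        # remaining bits select the word within it
    return module, offset

# Consecutive words of a block are spread across all four modules,
# so their accesses can overlap instead of being strictly serial.
for addr in range(8):
    print(addr, locate(addr))
```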
What role does the cache mapping function play in memory management?
It translates processor addresses to determine if the requested data is in the cache and manages data placement and retrieval.
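A sketch of how the mapping function might carve a processor address into tag, index, and offset fields, assuming a direct-mapped cache with 16-byte blocks and 64 blocks (the bit widths are assumptions for illustration):

```python
OFFSET_BITS = 4    # assumed 16-byte blocks  -> low 4 bits select the byte within a block
INDEX_BITS  = 6    # assumed 64 cache blocks -> next 6 bits select the cache block

def split_address(address):
    offset = address & ((1 << OFFSET_BITS) - 1)
    index  = (address >> OFFSET_BITS) & ((1 << INDEX_BITS) - 1)
    tag    = address >> (OFFSET_BITS + INDEX_BITS)
    return tag, index, offset

# The cache compares the tag stored at `index` with this tag:
# a match is a hit; a mismatch (or an invalid entry) is a miss.
print(split_address(0x1A2B3C))
```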