Week 5 & 6 - CPU & Cache Flashcards
What is the CPU?
The central processing unit (CPU, or processor) is the part of the computer that executes program instructions on program data.
What is the ALU?
The ALU is a complex circuit that implements all arithmetic and logic operations on signed and unsigned integers.
The ALU takes integer operand values and an opcode value that specifies the operation to perform (e.g., addition). The ALU outputs the resulting value of performing the specified operation on the operand inputs and condition code values that encode information about the result of the operation.
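As a rough software analogy (not a real circuit), the ALU can be sketched as a function that takes two operands and an opcode and produces a result together with condition codes; the opcode names and the particular condition codes shown here (zero, negative) are illustrative assumptions:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative opcodes for a toy ALU (names are assumptions, not from the notes). */
enum opcode { OP_ADD, OP_SUB, OP_AND, OP_OR };

/* The ALU outputs a result plus condition codes describing that result. */
struct alu_out {
    int32_t result;
    int zero;      /* result == 0        */
    int negative;  /* sign bit of result */
};

struct alu_out alu(int32_t a, int32_t b, enum opcode op) {
    struct alu_out out;
    switch (op) {
        case OP_ADD: out.result = a + b; break;
        case OP_SUB: out.result = a - b; break;
        case OP_AND: out.result = a & b; break;
        case OP_OR:  out.result = a | b; break;
    }
    out.zero = (out.result == 0);
    out.negative = (out.result < 0);
    return out;
}

int main(void) {
    struct alu_out r = alu(5, 5, OP_SUB);
    printf("result=%d zero=%d negative=%d\n", (int)r.result, r.zero, r.negative);
    return 0;
}
```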
Types of memory?
primary memory (stores data, programs, and instructions) and secondary memory (external storage devices)
Primary memory is RAM (SRAM and DRAM) and ROM (PROM, EPROM, EEPROM).
Why is memory important?
- Processing of large data (more memory means a greater capability for processing large data sets)
- Operating systems (a lack of memory increases the chance of crashes or erroneous functioning)
- Virtualization
- Overall performance (more memory in the system provides better performance)
- Browsing
- Multitasking (running many applications simultaneously)
What does the line decoder do?
selects only one of the output lines to set to 1 for each input bit pattern
● When Enable = 0, all the output lines are 0.
● When Enable = 1, the three-bit number at the input, x = x2x1x0, selects which output line is set to 1.
Line decoders are described as "input lines × output lines" (e.g., a 3-to-8 decoder has 3 input lines and 8 output lines).
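A minimal software sketch of a 3-to-8 line decoder with an Enable input (the function name and the bitmask return value are illustrative choices):

```c
#include <stdio.h>

/* 3-to-8 line decoder: the returned value has exactly one bit set,
 * corresponding to the selected output line. */
unsigned decode3to8(unsigned x, int enable) {
    if (!enable)
        return 0;            /* Enable = 0: all output lines are 0 */
    return 1u << (x & 0x7);  /* Enable = 1: output line x is set to 1 */
}

int main(void) {
    for (unsigned x = 0; x < 8; x++)
        printf("x=%u -> output lines=%08x\n", x, decode3to8(x, 1));
    return 0;
}
```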
How is storage space in a direct-mapped cache divided?
Into cache lines.
Each cache line contains two types of information: a cache data block and metadata.
Which part of the address identifies what line we are referring to?
index
What does the offset determine?
which byte (or word) within the cache data block is being referenced; the number of offset bits is determined by the cache's block size
The offset portion of an address must contain enough bits to refer to every possible byte within a cache data block.
What does the number of index bits determine?
number of cache lines (or blocks)
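A minimal sketch of splitting an address into tag, index, and offset fields, assuming a direct-mapped cache geometry of 64 lines with 32-byte blocks (these sizes are assumptions for illustration; 5 offset bits = log2(32), 6 index bits = log2(64)):

```c
#include <stdio.h>
#include <stdint.h>

#define OFFSET_BITS 5   /* log2(32-byte block)  */
#define INDEX_BITS  6   /* log2(64 cache lines) */

/* Split an address into its tag, index, and offset fields. */
void split_address(uint32_t addr) {
    uint32_t offset = addr & ((1u << OFFSET_BITS) - 1);
    uint32_t index  = (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    uint32_t tag    = addr >> (OFFSET_BITS + INDEX_BITS);
    printf("addr=0x%08x tag=0x%x index=%u offset=%u\n",
           (unsigned)addr, (unsigned)tag, (unsigned)index, (unsigned)offset);
}

int main(void) {
    /* These two addresses differ only in their tag bits, so they map to the
     * same cache line and would evict each other in a direct-mapped cache. */
    split_address(0x00001234);
    split_address(0x00002234);
    return 0;
}
```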
Which type of cache suffers the most from conflicts: direct-mapped, set associative, or fully associative?
direct-mapped
For example, even if a direct-mapped cache is not 100% full, a program might end up with the addresses of two frequently used variables mapping to the same cache location. In such cases, each access to one of those variables evicts the other from the cache as they compete for the same cache line.
What’s a fully associative cache?
A fully associative cache allows any memory region to occupy any cache location. Fully associative caches offer the most flexibility, but they also have the highest lookup and eviction complexity because every location needs to be simultaneously considered during any operation.
What is set associative cache?
A set associative design offers a good compromise between complexity and conflicts. The number of lines in a set limits how many places a cache needs to check during a lookup, and multiple memory regions that map to the same set don’t trigger conflict misses unless the entire set fills.
In a set associative cache, the index portion of a memory address maps the address to one set of cache lines. When performing an address lookup, the cache simultaneously checks every line in the set.
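A minimal sketch of a lookup in an assumed 2-way set associative cache (the geometry, struct fields, and names are illustrative). A fully associative cache is the limiting case of a single set containing every line, and a direct-mapped cache is the case of one-line sets:

```c
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define NUM_SETS    64
#define WAYS         2
#define OFFSET_BITS  5   /* 32-byte blocks  */
#define INDEX_BITS   6   /* log2(NUM_SETS)  */

struct cache_line {
    bool     valid;
    uint32_t tag;
    /* the cache data block itself is omitted for brevity */
};

struct cache_line cache[NUM_SETS][WAYS];

/* The index selects one set; every line in that set is then checked
 * (hardware performs these tag comparisons simultaneously). */
bool lookup(uint32_t addr) {
    uint32_t index = (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    uint32_t tag   = addr >> (OFFSET_BITS + INDEX_BITS);
    for (int way = 0; way < WAYS; way++)
        if (cache[index][way].valid && cache[index][way].tag == tag)
            return true;
    return false;
}

int main(void) {
    uint32_t addr = 0x00001234;
    uint32_t index = (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    /* Fill way 0 of the set this address maps to, then look it up. */
    cache[index][0].valid = true;
    cache[index][0].tag   = addr >> (OFFSET_BITS + INDEX_BITS);
    printf("hit=%d\n", lookup(addr));   /* prints hit=1 */
    return 0;
}
```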
Advantages and disadvantages of fully associative cache?
Pro: Flexible, data can be put anywhere (no conflicts)
Con: Have to search everywhere for data (slow/expensive)
Pros and cons of set-associative cache?
Pro: Reasonably flexible, data can be put anywhere in a set (reduced conflicts)
Con: Have to search several places for data (reasonable speed/complexity)
Pros and cons of direct-mapped cache?
Pro: only have to search one location (fast/simple)
Con: Data can only be put in one location (conflicts)