Zybook: Ch 1.2 - Eight great ideas in computer architecture Flashcards
Moore’s Law
The one constant for computer designers is rapid change. As computer designs can take years, the resources available per chip can easily double or quadruple between the start and finish of the project.
e.g. A soccer player runs not to where the ball is, but to where the ball will be.
Looking ahead is common, such as a married couple buying a house large enough for future kids.
Abstractions
to characterize the design at different levels of representation; lower-level details are hidden to offer a simpler model at higher levels.
e.g. A house architect first designs a house with 5 rooms, then designs room details like closets, windows, and flooring.
A “room” is a higher-level abstraction important during early design. Later, the architect thinks of lower-level details.
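The same layering shows up in code. A minimal Python sketch (the function names are hypothetical, just for illustration): the high-level call hides the lower-level encoding details, the way a “room” hides closets and flooring during early design.

```python
def _encode(text: str) -> bytes:
    """Lower-level detail: how text becomes bytes."""
    return text.encode("utf-8")

def _frame(payload: bytes) -> bytes:
    """Lower-level detail: length-prefix framing."""
    return len(payload).to_bytes(4, "big") + payload

def send(text: str) -> bytes:
    """Higher-level abstraction: callers never see encoding or framing."""
    return _frame(_encode(text))
```

A caller only needs `send`; the framing and encoding can change later without touching any caller.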
Common case
Making the common case fast will tend to enhance performance better than optimizing the rare case. Ironically, the common case is often simpler than the rare case and hence is usually easier to enhance.
e.g. A college student rents an apartment closer to campus than to her favorite weekend beach spot.
The student goes to campus more frequently than the beach, so she optimizes her commute for the common case.
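In code, “make the common case fast” often means a cheap fast path checked first, with the rare, general case handled by a slower fallback. A hedged sketch (the split between common and rare cases here is an assumption, not from the source):

```python
def parse_int(s: str) -> int:
    # Common case: a plain decimal literal like "42" -- handle it
    # with the cheapest possible check and conversion.
    if s.isdigit():
        return int(s)
    # Rare cases: signs, surrounding whitespace, underscores,
    # hex/binary prefixes -- slower, more general path.
    return int(s.strip().replace("_", ""), 0)
```

Most inputs take only one cheap `isdigit` test; only unusual inputs pay for the extra stripping and base detection.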
Performance via parallelism
computer architects have offered designs that get more performance by computing operations in parallel.
e.g. A sister is hanging clothes to dry. Her brother helps by hanging clothes simultaneously.
In this case, two people working in parallel can halve the task’s time.
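The laundry analogy can be sketched directly: two workers process independent halves of the pile at the same time, roughly halving wall-clock time. (This uses Python threads with a sleep standing in for slow, independent work; the names are illustrative.)

```python
from concurrent.futures import ThreadPoolExecutor
import time

def hang_clothes(items):
    for _ in items:
        time.sleep(0.01)          # stand-in for one slow, independent step
    return len(items)

clothes = list(range(20))
with ThreadPoolExecutor(max_workers=2) as pool:
    # Split the pile in half; each sibling hangs one half in parallel.
    counts = list(pool.map(hang_clothes, [clothes[:10], clothes[10:]]))
```

With two workers, the 20 waits overlap, so the elapsed time is close to that of 10 sequential waits.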
Pipelining
moves multiple operations through hardware units that each do a piece of an operation, akin to water flowing through a pipeline.
e.g. A brother is washing and drying dishes. His sister helps by drying each dish immediately after the brother washes each.
Dividing a task into stages is a way of improving performance. If the stages are equal in size, the task time may be nearly halved.
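The dish-washing pipeline can be sketched with two threads and a queue between the stages: a dish is dried while the next one is being washed, so the stages overlap just like hardware pipeline stages. (A simplified model, not how real hardware is programmed.)

```python
import queue
import threading
import time

def wash(dishes, out_q):
    for dish in dishes:
        time.sleep(0.01)          # washing one dish takes one stage-time
        out_q.put(dish)           # hand the clean dish to the dryer
    out_q.put(None)               # sentinel: no more dishes

def dry(in_q, done):
    while (dish := in_q.get()) is not None:
        time.sleep(0.01)          # drying one dish takes one stage-time
        done.append(dish)

between = queue.Queue()
dried = []
washer = threading.Thread(target=wash, args=(range(5), between))
dryer = threading.Thread(target=dry, args=(between, dried))
washer.start(); dryer.start()
washer.join(); dryer.join()
# 5 dishes finish in roughly 6 stage-times instead of 10 sequential ones.
```

The first dish still takes two stage-times end to end, but after that one dish finishes every stage-time; pipelining improves throughput, not the latency of a single item.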
Performance via prediction
in some cases it can be faster on average to guess and start working rather than wait until you know for sure, assuming that the mechanism to recover from a misprediction is not too expensive and your prediction is relatively accurate.
e.g. A mom expects her son will be hungry after a long airplane flight, so she cooks dinner just in case. If he’s not hungry, she’ll whip up a dessert instead.
By predicting he’ll be hungry, she’s able to finish the job (end his hunger) faster than if she waited for him to get home.
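The airplane example maps onto speculation in code: guess the likely outcome, start the work early, and recover cheaply if the guess was wrong. A hedged sketch with made-up names and a deliberately simple predictor:

```python
def predict_hungry(flight_hours):
    # Simple predictor: long flights usually mean a hungry traveler.
    return flight_hours >= 3

def prepare(flight_hours, actually_hungry):
    # Speculate: start cooking before the real outcome is known.
    speculative_meal = "dinner" if predict_hungry(flight_hours) else None
    # ...the flight lands; now we learn the real outcome...
    if actually_hungry:
        # Correct prediction: the work is already done.
        return speculative_meal or "cook dinner now (slow path)"
    # Misprediction: discard the speculative work and recover.
    return "dessert instead"
```

Speculation pays off only when the predictor is usually right and the recovery path (“dessert instead”) is cheap, exactly the two conditions stated above.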
Dependability via Redundancy
Computers not only need to be fast; they need to be dependable. Since any physical device can fail, we make systems dependable by including redundant components that can take over when a failure occurs and that help detect failures.
e.g. A drummer’s stick breaks, but he quickly grabs another one and continues playing the song.
Having extras/backups is a good idea in many scenarios.
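A classic software form of this idea is triple modular redundancy: run three replicas and take a majority vote, so one faulty unit (like one broken drumstick) doesn’t stop the show. A minimal sketch with hypothetical replicas, one of which is deliberately faulty:

```python
from collections import Counter

def replica_a(x): return x + 1
def replica_b(x): return x + 1
def replica_c(x): return x + 2   # faulty replica: produces a wrong answer

def vote(x):
    results = [replica_a(x), replica_b(x), replica_c(x)]
    answer, count = Counter(results).most_common(1)[0]
    # With no majority, the failure is detected but not masked.
    assert count >= 2, "no majority: unrecoverable failure detected"
    return answer
```

The vote both masks the single failure (the majority answer wins) and detects it (the disagreeing replica can be flagged for repair).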
Hierarchy of memories
caches give the programmer the illusion that main memory is almost as fast as the top of the hierarchy and nearly as big and cheap as the bottom of the hierarchy. We use a layered triangle icon to represent the memory hierarchy. The shape indicates speed, cost, and size: the closer to the top, the faster and more expensive per bit the memory; the wider the base of the layer, the bigger the memory.
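The illusion can be sketched as a tiny, fast cache sitting in front of a big, slow store: repeated accesses hit the fast level, so the slow level looks fast on average. A minimal LRU cache (the store, capacity, and access pattern here are illustrative assumptions):

```python
from collections import OrderedDict

SLOW_STORE = {k: k * k for k in range(1000)}   # big, cheap, slow level

class LRUCache:
    def __init__(self, capacity=4):            # small, fast, costly level
        self.capacity, self.data = capacity, OrderedDict()
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.data:                   # hit: served from fast level
            self.hits += 1
            self.data.move_to_end(key)         # mark as recently used
            return self.data[key]
        self.misses += 1                       # miss: go down the hierarchy
        value = SLOW_STORE[key]
        self.data[key] = value
        if len(self.data) > self.capacity:     # evict least recently used
            self.data.popitem(last=False)
        return value

cache = LRUCache()
for key in [1, 2, 1, 1, 3, 2]:                 # reuse makes hits common
    cache.get(key)
```

Because programs tend to reuse recently touched data (locality), most accesses hit the small fast level, which is what makes the layered triangle work.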