30. Performance and Benchmarking Flashcards

1
Q

Why do we care about OS performance?

A

People prefer a mostly-correct but extremely fast system to a completely correct and slow system

Think about it: Would crashes bother you at all if your system rebooted instantaneously?

2
Q

What is the four-step process for improving OS performance?

A
  1. Measure your system (after deciding how)
  2. Analyze the results (statistics)
  3. Improve the slow parts (after deciding which slow parts matter most)
  4. Drink celebratory beer
3
Q

Why is measuring system performance very challenging?

A

High-level software counters may not have fine enough resolution to measure extremely fast events

Low-level hardware counters often have extremely device-specific interfaces, making measuring hard

Measurements aren’t always repeatable because the system we’re trying to measure uses past events in an attempt to predict the future, so the system is almost never in the exact same state it was in the last time it was measured
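
A minimal sketch (not from the source) of two of these obstacles on Linux: clock_getres reports the timer's advertised resolution, and timing back-to-back clock_gettime calls estimates the overhead of the measurement itself:

    #include <stdio.h>
    #include <time.h>

    /* Illustrative sketch: a single fast event may be shorter than the
     * timer's resolution, and reading the timer itself has a cost. */
    int main(void) {
        struct timespec res;
        clock_getres(CLOCK_MONOTONIC, &res);
        printf("CLOCK_MONOTONIC resolution: %ld ns\n", res.tv_nsec);

        /* Estimate the cost of measuring by timing repeated timer reads. */
        const int iters = 1000000;
        struct timespec start, end, t;
        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int i = 0; i < iters; i++)
            clock_gettime(CLOCK_MONOTONIC, &t);
        clock_gettime(CLOCK_MONOTONIC, &end);

        double ns = (end.tv_sec - start.tv_sec) * 1e9
                  + (end.tv_nsec - start.tv_nsec);
        printf("cost per clock_gettime call: ~%.1f ns\n", ns / iters);
        return 0;
    }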

4
Q

What are the three ways that measuring a real system might affect the measurements you’re trying to take?

A
  1. Measurements may destroy the problem you are trying to measure
  2. Must separate results from the noise produced by measurement
  3. Measurement overhead may limit your access to real systems

Ex: Vendor: “No way am I running your instrumented binary. Your software is slow enough already!”

5
Q

Describe building a model as a means of measuring a real system’s performance.

A

Abstract away all of the low-level details and reason analytically

(Think: equations)
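
For example (an illustrative model, not from the source), the average time to read one disk block might be modeled as

    T_read ≈ T_seek + T_rotation + block_size / transfer_bandwidth

which lets you predict performance from data-sheet numbers without running anything.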

6
Q

Describe building a simulator as a means of measuring a real system’s performance.

A

Write some additional code to perform a simplified simulation of the more complex parts of the system, particularly the hardware

(Think: code)
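
A minimal sketch of the idea (the cache geometry and address trace are made-up assumptions): simulate a direct-mapped hardware cache in a few lines of C and report the hit rate for a synthetic access stream:

    #include <stdio.h>
    #include <stdint.h>

    /* Illustrative simulator: a direct-mapped cache with assumed geometry. */
    #define NUM_LINES  256              /* assumed: 256 cache lines */
    #define LINE_BYTES 64               /* assumed: 64-byte lines   */

    static uint64_t tags[NUM_LINES];
    static int      valid[NUM_LINES];

    /* Returns 1 on a hit, 0 on a miss (and fills the line). */
    static int access_cache(uint64_t addr) {
        uint64_t line  = addr / LINE_BYTES;
        uint64_t index = line % NUM_LINES;
        uint64_t tag   = line / NUM_LINES;
        if (valid[index] && tags[index] == tag)
            return 1;
        valid[index] = 1;
        tags[index]  = tag;
        return 0;
    }

    int main(void) {
        long hits = 0, total = 0;
        /* Synthetic trace standing in for addresses captured from a real
         * workload: ten strided sweeps over a 64 KB region. */
        for (int pass = 0; pass < 10; pass++) {
            for (uint64_t addr = 0; addr < 64 * 1024; addr += 16) {
                hits += access_cache(addr);
                total++;
            }
        }
        printf("simulated hit rate: %.1f%%\n", 100.0 * hits / total);
        return 0;
    }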

7
Q

What are the PROS and CONS of models to measure system performance?

A

Pro: Can make strong mathematical guarantees about system performance

Con: These guarantees usually come after making a bunch of unrealistic assumptions

8
Q

What are the PROS and CONS of simulations to measure system performance?

A

Pro: In the best case, experimental speedup outweighs the lack of hardware details

Con: In the worst case, bugs in the simulator lead you in all sorts of wrong directions

9
Q

What is a microbenchmark?

A

Isolating one aspect of system performance
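
A minimal microbenchmark sketch (assumed, not from the source): isolate the cost of one trivial system call, getpid, by repeating it many times and averaging:

    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    /* Illustrative microbenchmark: measures only system call overhead. */
    int main(void) {
        const long iters = 1000000;
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (long i = 0; i < iters; i++)
            getpid();                   /* the one operation under test */
        clock_gettime(CLOCK_MONOTONIC, &end);

        double ns = (end.tv_sec - start.tv_sec) * 1e9
                  + (end.tv_nsec - start.tv_nsec);
        printf("getpid: ~%.1f ns per call\n", ns / iters);
        return 0;
    }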

10
Q

What is a macrobenchmark?

A

Measuring one operation involving many parts of the system working together.
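
A minimal macrobenchmark sketch (assumed, not from the source; the file name and sizes are made up): time a write-sync-read-delete sequence, which exercises the system call layer, the file system, the buffer cache, and the disk together:

    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <time.h>

    /* Illustrative macrobenchmark: one operation, many parts of the system. */
    int main(void) {
        const char *path = "macro_test.tmp";    /* assumed scratch file */
        char buf[4096];
        memset(buf, 'x', sizeof buf);

        struct timespec start, end;
        clock_gettime(CLOCK_MONOTONIC, &start);

        int fd = open(path, O_CREAT | O_WRONLY | O_TRUNC, 0644);
        if (fd < 0) { perror("open"); return 1; }
        for (int i = 0; i < 2560; i++)          /* write ~10 MB */
            if (write(fd, buf, sizeof buf) < 0) { perror("write"); return 1; }
        fsync(fd);                              /* force data to the disk */
        close(fd);

        fd = open(path, O_RDONLY);
        while (read(fd, buf, sizeof buf) > 0)   /* read it back */
            ;
        close(fd);
        unlink(path);

        clock_gettime(CLOCK_MONOTONIC, &end);
        double sec = (end.tv_sec - start.tv_sec)
                   + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("write+sync+read of ~10 MB: %.3f s\n", sec);
        return 0;
    }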

11
Q

What is an application benchmark?

A

Focusing on the performance of the system as observed by one application.

12
Q

What are the three general rules for effectively measuring system performance?

A
  1. Have a goal in mind more specific than “I want to make this blob of code faster.” This helps choose measurement techniques and benchmarks
  2. Validate your models and simulator before you start changing things
    - Do their results match your intuition? If not, something is wrong
    - Do their results match reality? If not, something is really wrong
  3. Use modeling, simulation, and real experiments as appropriate
    - If you can’t convince yourself analytically that a new approach is an improvement, don’t bother simulating
    - If your simulator doesn’t show improvement, don’t bother implementing