Usability Evaluation Flashcards

1
Q

Usability Evaluation

A

A process to gather information about a system’s usability to improve or assess it

2
Q

3 goals of evaluation

A
  1. Functionality: Suited to user tasks?
  2. User Experience: Learnability, satisfaction
  3. Problem-Spotting: Pinpoint design flaws
3
Q

Heuristic evaluation (Nielsen)

A

Experts check against 10 usability heuristics and list the issues

4
Q

Cognitive Walkthrough

A

Experts step through a task as a first-time user would, to assess learnability

5
Q

User Testing (Empirical)

A
  • Think-Aloud: Users verbalise thoughts (but may disrupt task flow)
  • Observation/Logging: Watch real use (Hawthorne effect risk)
  • A/B Testing: Compare alternative designs (e.g. button colours)
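The A/B comparison above can be sketched as a two-proportion z-test. This is a minimal illustration, not part of the original card; the click counts and the 5% significance threshold are invented for the example.

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical data: 120/1000 clicks on the blue button, 150/1000 on the green
p_a, p_b, z = ab_test(120, 1000, 150, 1000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# |z| > 1.96 would indicate significance at the 5% level (two-tailed)
```

In practice a library routine (e.g. a proportions test from a statistics package) would be used instead of hand-rolling the formula.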
6
Q

Key Usability Heuristics (Nielsen)

A
  1. Visibility of Status: Progress feedback (e.g., loading bars).
  2. Match Real World: Familiar terms/icons (e.g., trash bin = delete).
  3. User Control: Undo, cancel, quit options.
  4. Consistency: Uniform buttons, labels.
  5. Error Prevention: Confirmations (e.g., “Delete forever?”).
  6. Recognition > Recall: Show options (no memorization).
  7. Flexibility: Shortcuts for experts.
  8. Minimalist Design: Avoid clutter.
  9. Help Users Recover: Clear error messages.
  10. Help/Docs: Accessible guidance.
7
Q

Severity Matrix

A

Prioritise issues by frequency (how often users encounter it) and impact (how disruptive it is)

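A severity matrix can be sketched as a simple score-and-sort: rate each issue's frequency and impact, multiply, and fix the highest scores first. The issue names and the 1–3 rating scale below are illustrative assumptions.

```python
# Hypothetical issue list; severity = frequency x impact (each rated 1-3)
issues = [
    {"name": "Hidden logout button", "frequency": 3, "impact": 1},
    {"name": "Data loss on timeout", "frequency": 1, "impact": 3},
    {"name": "Confusing error text", "frequency": 3, "impact": 3},
]

for issue in issues:
    issue["severity"] = issue["frequency"] * issue["impact"]

# Highest-severity issues come first in the fix queue
ranked = sorted(issues, key=lambda i: i["severity"], reverse=True)
for i in ranked:
    print(f'{i["severity"]}: {i["name"]}')
```

Note that a frequent-but-minor issue and a rare-but-severe one can score the same, which is why teams often also plot the two axes separately rather than relying on the product alone.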
8
Q

Cognitive Walkthrough Steps

A
  1. Goal: Will users know what to do?
  2. Action Visibility: Is the button obvious?
  3. Mapping: Do actions produce the effect users expect?
  4. Feedback: Does UI confirm success?
9
Q

Common Evaluation Pitfalls

A
  • Testing on Yourself: Designers are not representative users
  • Wrong Users: Non-representative samples
  • Too Late: Fixing post-launch is costly
  • Ignoring Context: Lab ≠ real-world use