Design Evaluation Flashcards
Why evaluate design?
- Improve usability
- Reduce flaws after launch
What part of the design is being evaluated?
- All levels, parts, and attributes
Where is the design being evaluated?
- Laboratory (for controlled scenarios and internal validity)
- Natural Setting (for realistic scenarios and external validity)
When to evaluate design?
- Anytime
- Formative evaluation (during design and development; typically qualitative)
- Summative evaluation (after the design is complete; typically quantitative)
What are evaluation issues?
- Surroundings (environment must be considered in the evaluation)
- Bias (unrepresentative or undiversified sample group)
- Hawthorne Effect (Reactivity due to awareness of being observed)
What is Analytical Evaluation?
- Evaluation that doesn’t involve users
- Predict user behaviour and identify usability problems
- Discover design issues before usability testing
What is Heuristic Evaluation?
- "Discount" (fast and cheap) usability engineering method
- Works on simple prototypes
- 3-5 expert evaluators (not end users)
What steps are in Heuristic Evaluation?
- Each evaluator inspects the interface multiple times, comparing its components against usability principles (heuristics)
- Evaluators aggregate their findings
- Reveals usability problems
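The aggregation step above can be sketched in code. This is a minimal illustration, not a prescribed tool; the finding tuples and heuristic names are hypothetical examples.

```python
from collections import defaultdict

# Hypothetical findings: (evaluator, heuristic violated, problem seen).
findings = [
    ("evaluator_1", "visibility of system status", "no loading indicator"),
    ("evaluator_2", "visibility of system status", "no loading indicator"),
    ("evaluator_2", "error prevention", "no confirmation on delete"),
]

# Group duplicate problems so each appears once, recording who found it.
aggregated = defaultdict(set)
for evaluator, heuristic, problem in findings:
    aggregated[(heuristic, problem)].add(evaluator)
```

Problems reported by several evaluators independently are often the ones worth fixing first.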
What is Usability Testing?
- Identify problems in a product’s design and learn about user behaviour and preference
- Measure performance and satisfaction with a system
- Laboratory setting
What components are in Usability Testing?
- Participants
- Tasks
- Facilitator
What steps are in Usability Testing?
- Prepare tasks, find participants, and set up test materials
- Invite participants, and observe and ask questions
- Give scenario verbally and textually
- Give tasks one at a time
- Give short breaks if tasks are long
- Take notes and collect data (audio and video)
- Debrief participants after tasks
- Analyze data, find problems, summarize results, and make recommendations
How do you effectively collect data?
- Think-Aloud Protocol
- During testing (concurrent): participants verbalize their thoughts while performing tasks
- After testing (retrospective): participants recall how they felt during a task
- Audio and video recording
- Questionnaires
- Start with general demographic information
- Use open-ended questions
What type of questions are there?
- Likert Scale
- Semantic Scale
- Ranking Questions
- Open-ended Questions
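Scale questions like Likert items are usually scored numerically. A minimal sketch of scoring a 5-point Likert questionnaire follows; the item names and the reverse-coded set are hypothetical.

```python
# Raw 5-point Likert responses (1 = strongly disagree, 5 = strongly agree).
responses = {"easy_to_use": 5, "felt_confusing": 1, "would_use_again": 4}

# Negatively worded items are reverse-coded so that a higher score
# always means a more positive experience.
reverse_coded = {"felt_confusing"}

scored = {
    item: (6 - value if item in reverse_coded else value)
    for item, value in responses.items()
}
mean_score = sum(scored.values()) / len(scored)
```

Mixing positively and negatively worded items helps detect participants who answer without reading.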
What is the UEQ?
- 26 pairs of contrasting attributes to be rated
- Semantic scale questions
- Comes with data analysis tools
What is the UEQ-S?
- Shorter version of UEQ
- 8 pairs instead of 26
- Aggregates into 3 scales:
- Pragmatic quality (practical, goal-directed)
- Hedonic quality (appeal, non-goal-directed)
- Overall (both combined)
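The UEQ-S aggregation can be sketched as below. This assumes the common convention that raw answers are on a 1-7 scale rescaled to -3..+3, with the first four items forming the pragmatic scale and the last four the hedonic scale; the official UEQ analysis tool also handles reversed item polarity, which this sketch omits.

```python
def ueq_s_scores(answers):
    """Aggregate 8 raw UEQ-S item responses (each 1-7) into scale means."""
    if len(answers) != 8:
        raise ValueError("UEQ-S has exactly 8 items")
    # Rescale 1..7 to -3..+3 so that 0 is the neutral midpoint.
    scaled = [a - 4 for a in answers]
    return {
        "pragmatic": sum(scaled[:4]) / 4,  # assumed: items 1-4
        "hedonic": sum(scaled[4:]) / 4,    # assumed: items 5-8
        "overall": sum(scaled) / 8,
    }
```

For real studies, the analysis spreadsheet distributed with the UEQ should be used, since it includes item polarity handling and benchmark comparisons.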
What is Qualitative Data/Analysis?
Observational-based
- Facilitator takes notes and recordings
- Participants write or voice comments
What is Quantitative Data/Analysis?
Assessment-based
- Measures from participant tasks and questionnaires
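Two common quantitative measures from participant tasks are the task success rate and the mean completion time. A minimal sketch, with hypothetical record fields and data:

```python
# Hypothetical per-participant task records; field names are assumptions.
tasks = [
    {"participant": "P1", "completed": True,  "seconds": 42.0},
    {"participant": "P2", "completed": False, "seconds": 90.0},
    {"participant": "P3", "completed": True,  "seconds": 55.0},
]

# Fraction of participants who finished the task successfully.
successes = sum(t["completed"] for t in tasks)
success_rate = successes / len(tasks)

# Mean completion time, counting only successful attempts.
mean_time = sum(t["seconds"] for t in tasks if t["completed"]) / successes
```

Whether failed attempts should count toward timing depends on the study design; here they are excluded.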
What should you look out for during Qualitative Analysis?
- Frequently occurring problems
- Workarounds
- Enjoyment
What do you do with results from Qualitative Analysis?
- Prioritize problems based on severity and use usability goals to redesign
- Highlight the good results
- Provide recommendations, not rules (consider constraints such as branding and standards)
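Prioritizing by severity can be sketched as a simple sort. The severity scale and the example problems below are hypothetical; frequency is used as a tiebreaker.

```python
# Hypothetical usability problems with an assumed severity rating
# (1 = cosmetic, 4 = blocker) and frequency (participants affected).
problems = [
    {"issue": "label unclear",  "severity": 2, "frequency": 5},
    {"issue": "checkout fails", "severity": 4, "frequency": 2},
    {"issue": "slow search",    "severity": 3, "frequency": 4},
]

# Rank by severity first, then by how many participants hit the issue.
ranked = sorted(
    problems,
    key=lambda p: (p["severity"], p["frequency"]),
    reverse=True,
)
```

The top of the ranked list feeds the redesign; lower-severity items become recommendations rather than rules.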