UE EVALUATION: Quickfire usability evaluation Flashcards
Technical software evaluation (testing)
Technical software evaluations focus on the software and not on the interaction with the user
Formal technical reviews
Include others besides the developer
Technical Software testing
*White-box (basis path testing, control structure testing)
*Black-box testing (focus on functional requirements)
Software testing strategies
*Unit Testing
*Integration Testing
*Validation Testing
*System Testing (Recovery testing, Security testing, Stress testing, Performance testing)
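The black-box idea above (test against functional requirements, ignoring internals) can be sketched with a minimal unit test; `clamp` is a hypothetical function invented here purely for illustration:

```python
import unittest

def clamp(value, low, high):
    """Hypothetical function under test: restrict value to [low, high]."""
    return max(low, min(value, high))

class ClampBlackBoxTest(unittest.TestCase):
    """Black-box unit tests: only the functional requirement
    (output stays within [low, high]) is checked, not the code paths."""

    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)

if __name__ == "__main__":
    unittest.main()
```

A white-box (basis path) approach would instead derive one test per independent path through the function's control flow.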
What is evaluation
Evaluation: a process through which information about the usability of a system is gathered in order to improve the system or to assess a completed interface
3 Goals for evaluation
1. Assess system’s functionality
*matching user to task -> appropriate function
2. Assess effect of interface on user
*how easy to learn
*usability
*matches users' expectations
3. Identify specific problems with the system
*specifically with features of the design
*may include contextual features
4 key considerations for selecting evaluation approaches
To determine the type and form of evaluative study to carry out, what are the:
1.characteristics of users
2.types of activities users do
3.environment of study
4.nature of artefact being evaluated
Need for formal study: COMMON MISTAKES
*relying on 'common sense'
*testing on yourself
*failing to speak to end users
*testing the wrong users
*evaluating too late to change anything
Where in the cycle?
Formative: used to check decisions made and to assist future decisions, e.g., which alternative should be adopted? Are there problems with the design?
*iterative
Summative: done at the END of the design process, e.g., does it work as intended?
Usability Evaluation - three main forms
Analytical:
1. expert review
*based on expert's opinions
*can be done at different stages of design
2. abstract testing
*draw from existing data
*e.g. cognitive analysis / GOMS / KLM
Empirical:
3. user or usability testing
*experimental or qualitative studies of use
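Abstract testing with KLM can be sketched as simple arithmetic: sum the textbook operator times for a task. The operator values below are the commonly cited approximations from the KLM literature (the keystroke time varies with typing skill); the example task sequence is invented for illustration:

```python
# Approximate KLM operator times in seconds (textbook values);
# K varies with typing skill -- 0.2 s is an average-typist figure.
KLM_TIMES = {
    "K": 0.2,   # keystroke or button press
    "P": 1.1,   # point with mouse to a target
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Sum operator times for a task written as a string of operators."""
    return sum(KLM_TIMES[op] for op in sequence)

# Example: mentally prepare (M), point to a field (P), home to
# keyboard (H), type four characters (KKKK)
print(round(klm_estimate("MPHKKKK"), 2))  # 3.65
```

No users are needed: the estimate is drawn entirely from existing data, which is what makes this an analytical rather than empirical method.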
expert review: ‘quickfire’ approaches
*heuristic evaluation
  *several passes; 'scorecard' approach
  *assessment of conformance to 10 design 'rules'
*consistency inspection
  *terminology, colour, layout, inputs, outputs (also training materials and help systems)
*cognitive walkthrough
  *similar to software engineering 'code walkthrough', but stepping through user actions on the interface to simulate users doing the task
  *especially for frequent tasks, but also rare but critical tasks
*formal usability inspection
  *adversarial courtroom-style meeting with a moderator
  *present & discuss weaknesses/strengths
Analytic method – Heuristic Evaluation (Nielsen)
*Experts independently evaluate the user interface
*whether it conforms to established usability principles (heuristics)
*‘Discount usability engineering’ (quick and dirty)
*Evaluation should last 1 or 2 hours
*Method can be applied to paper prototypes
Output of the evaluation:
*List of usability problems in the interface
*Referring to usability principles that were violated
Analytic method - heuristic evaluation: which problems are the same, which are not?
Best practice - each expert reports a problem in this format:
1.Problem description: a brief description of the problem
2.Likely/actual difficulties: the anticipated difficulties that the user will encounter as a consequence of the problem
3.Specific contexts: the specific context in which the problem may occur
4.Assumed causes: description of the cause(s) of the problem
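The four-part report format above maps naturally onto a record type. A minimal sketch, with all field names and the example problem invented here for illustration:

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    """One expert-reported problem, in the four-part best-practice format."""
    description: str     # 1. brief description of the problem
    difficulties: str    # 2. anticipated difficulties for the user
    context: str         # 3. specific context in which it may occur
    assumed_causes: str  # 4. description of the cause(s)

# Hypothetical example report
report = UsabilityProblem(
    description="'Save' and 'Export' buttons look identical",
    difficulties="Users may export when they meant to save",
    context="Main toolbar, when a document has unsaved changes",
    assumed_causes="Same icon reused for both actions",
)
print(report.description)
```

Structuring reports this way makes it easier to compare the independent experts' lists and decide which reported problems are actually the same.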
User control and freedom
Strategies:
Cancel button (for dialogs waiting for user input)
Universal Undo (can get back to previous state)
Interrupt (especially for lengthy operations)
Quit (for leaving the program at any time)
Defaults (for restoring a property sheet / starting again)
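The Universal Undo strategy above (always able to get back to the previous state) can be sketched as a history stack; the `UndoableEditor` class is a hypothetical example, not a real API:

```python
class UndoableEditor:
    """Minimal sketch of Universal Undo: every change pushes the
    previous state, so the user can always return to it."""

    def __init__(self, text=""):
        self.text = text
        self._history = []

    def edit(self, new_text):
        self._history.append(self.text)  # remember state before the change
        self.text = new_text

    def undo(self):
        if self._history:                # no-op when there is nothing to undo
            self.text = self._history.pop()

editor = UndoableEditor("draft")
editor.edit("draft v2")
editor.undo()
print(editor.text)  # draft
```

The same stack discipline supports the Cancel and Defaults strategies: Cancel discards a pending change instead of committing it, and Defaults simply restores a saved initial state.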