Block 4 Flashcards

1
Q

Why evaluate

A

Business reputation; financial success; being relevant in the complex, real world; keeping up with tech

2
Q

Business reputation

A

Negative feedback can damage a company

3
Q

Financial success

A

Appropriate evaluation carried out at the right points in the interaction design process is a prudent investment

4
Q

Being relevant in complex, real world

A

By testing with real people in real contexts, you can pick up on cultural aspects that you might not have anticipated

5
Q

Keeping up with tech

A

Evaluating with users allows designers to experiment with prototypes of novel models of interaction, and gain fresh insights into emerging interaction paradigms

6
Q

Iterative design

A

Around 15 users are needed to discover all the usability problems in a design

Better to distribute evaluations over smaller groups

If you have funding for 15 users, spend it on three studies with 5 users in each (see the sketch below)
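A rough sketch of the arithmetic behind this advice, assuming the Nielsen-Landauer problem-discovery model (the model is not named on the card; 31% is its commonly cited average discovery rate per user):

    # Sketch of the Nielsen-Landauer problem-discovery model (assumption).
    # Expected fraction of usability problems found by n test users.
    LAMBDA = 0.31  # average chance that one user exposes a given problem

    def proportion_found(n: int, lam: float = LAMBDA) -> float:
        return 1 - (1 - lam) ** n

    print(f"{proportion_found(5):.0%}")   # ~84% of problems from one 5-user study
    print(f"{proportion_found(15):.0%}")  # ~100%, but spent on a single iteration

Three studies of five users each find most problems per round, and each round tests an improved design.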

7
Q

Evaluation methods

A

User testing - Investigation of whether there is a need for the design

Usability testing - Evaluation of whether the system is usable by the intended users

8
Q

Choosing and combining methods

A

Many methods available in evaluation toolkit

Typically finding a fault requires a combination of methods

Not all evaluations need to be detailed

Opportunistic evaluation is done informally and gets quick feedback. This is done early in the design process

9
Q

Evaluating for accessibility

A

Some instances in which experts may be used to provide info that cannot be ascertained from users

E.g. users with visual or hearing impairments

10
Q

New methods

A

New methods and variations on existing methods constantly being added to evaluation repertoire

11
Q

Ethical issues and informed consent

A

Becoming an increasingly sensitive area, due to the potential to gather and disseminate large amounts of personal data quickly over the internet

12
Q

Adapting interviews and instructions

A

Need to consider whether any instructions given to users need to be adapted

13
Q

Adapting time allowances

A

Users with disabilities often require more time to complete an evaluation than users without disabilities

14
Q

Adapting physical arrangements

A

Need to consider whether the setting needs to be adapted

15
Q

When a helper, interpreter or advocate may be needed

A

There are times when a helper or user advocate is needed to work alongside the participant

16
Q

Evaluating with children

A

Increasingly, children are included in the design and evaluation of interactive products, and each evaluation must be adapted to them

17
Q

Informed consent

A

About protecting rights of all parties involved in an evaluation by providing them with all the info they need to decide whether or not to participate

Need the participant's agreement to take part in the evaluation, their permission to record it, and their permission to use and store the data

18
Q

UK Data Protection Act

A

Controls how personal info is used by organisations, businesses or the government

Everyone responsible for using data has to follow strict rules called 'data protection principles'

19
Q

Any data given must -

A

Be used fairly and lawfully
Be used for limited, specifically stated purposes
Be used in a way that is adequate, relevant and not excessive
Be accurate
Be kept for no longer than absolutely necessary
Be handled according to people's data protection rights
Be kept safe and secure
Not be transferred outside the European Economic Area without adequate protection

20
Q

From data to info

A

Any evaluation will produce data of some kind

Whatever the nature of the data, the purpose of analysing and interpreting it is to transform it into info relevant and useful to the design process

21
Q

Making sense of data

A

Analysis, interpretation, presentation

22
Q

Analysis

A

Follow three steps -

  1. Collating the data - Gathering all data collected and organising it for processing
  2. Analysing and summarising the data - Extracting patterns or other observations from the collated data. These patterns are the first step in making sense of the data
  3. Reviewing the data - Assessing whether the usability and user experience goals for the interactive product have been met (a small sketch of these steps follows)
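A minimal sketch of these three steps, assuming task-completion data was logged per participant (the data, column names and the use of pandas are all assumptions for illustration):

    import pandas as pd

    # Step 1 (collate): hypothetical session logs, one row per participant per task
    data = pd.DataFrame({
        "participant": ["P1", "P1", "P2", "P2", "P3", "P3"],
        "task":        ["search", "checkout"] * 3,
        "seconds":     [42, 118, 55, 97, 38, 140],
        "errors":      [0, 2, 1, 1, 0, 3],
    })

    # Step 2 (analyse and summarise): extract per-task patterns
    print(data.groupby("task")[["seconds", "errors"]].agg(["mean", "max"]))

    # Step 3 (review): check a usability goal, e.g. checkout completed within 2 minutes
    print((data.loc[data["task"] == "checkout", "seconds"] <= 120).all())
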
23
Q

Interpretation

A

Data interpretation in evaluation is the process of actively considering what caused the problems that have been identified, and what to do about them

24
Q

Three steps to interpretation

A
  1. Finding causes for usability problems that have been identified during analysis and rating seriousness of each
  2. Prioritising issues
  3. Proposing changes to design or making other recommendations to address these problems
25
Q

Presentation

A

Best way of presenting data needs to be decided

Though simple presentations are often sufficient in an iterative design process, when evaluations inform the design as it progresses, sometimes stronger evidence is needed

26
Q

Working with Quantitative data

A

Three groups of methods for summarising data -

  1. Tabulations, charts and rankings
  2. Descriptive statistics
  3. Inferential statistics
27
Q

Tabulations, charts and rankings

A

Provide visual representation of your data

28
Q

Descriptive statistics

A

Statistics such as the mean, median and mode that describe the data you have obtained
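A minimal sketch using Python's standard statistics module on hypothetical task-time data:

    import statistics

    # Hypothetical task-completion times in seconds (assumed data)
    times = [42, 55, 38, 61, 55, 47, 55, 120]

    print(statistics.mean(times))    # 59.125 - pulled up by the 120s outlier
    print(statistics.median(times))  # 55 - robust to the outlier
    print(statistics.mode(times))    # 55 - the most frequent value

Reporting the median alongside the mean guards against a few extreme values distorting the summary.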

29
Q

Inferential statistics

A

Tests of statistical significance that give the probability that a claim arising from your data can be applied to your user population as a whole
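A sketch of one common significance test, assuming SciPy is available and two hypothetical groups of task times (say, old design vs. new design):

    from scipy import stats

    # Hypothetical task times (seconds) for two independent groups of users
    old_design = [61, 72, 58, 80, 69, 75]
    new_design = [48, 55, 50, 62, 44, 53]

    # Independent-samples t-test: a small p-value (conventionally < 0.05)
    # suggests the difference would hold for the user population as a whole
    t_stat, p_value = stats.ttest_ind(old_design, new_design)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")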

30
Q

Analysing and interpreting quantitative data

A

If you include descriptive statistics in your report, consider whether they are generalisable

31
Q

Representing quantitative data visually

A

Important to think about, and choose, the most appropriate data display for your results and for the audience to whom you're presenting your data
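A minimal sketch using Matplotlib, with hypothetical per-task error counts; a chart like this suits a design-team readout, while the same numbers in a table may suit a formal report:

    import matplotlib.pyplot as plt

    # Hypothetical error counts per task, summed across participants
    tasks = ["search", "filter", "checkout"]
    errors = [3, 7, 12]

    plt.bar(tasks, errors)
    plt.ylabel("Errors observed")
    plt.title("Errors per task")
    plt.show()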

32
Q

Working with qualitative data

A

Three simple forms of qualitative analysis (a small tallying sketch follows)

  1. Identifying recurring patterns or themes
  2. Categorising data
  3. Analysing critical incidents
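A small sketch of the first two forms, assuming observations have already been hand-coded with theme labels (the codes below are hypothetical):

    from collections import Counter

    # Hypothetical theme codes assigned to interview remarks during coding
    coded_remarks = [
        "navigation", "terminology", "navigation", "feedback",
        "navigation", "terminology", "trust",
    ]

    # Recurring patterns surface as the most frequent categories
    for theme, count in Counter(coded_remarks).most_common():
        print(f"{theme}: {count}")
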
33
Q

Repeatability

A

A test of rigorous qualitative analysis: can the patterns or categories that emerge be articulated clearly enough for an independent evaluator to apply them with the same result?
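The card does not name a measure, but one standard way to quantify agreement between two independent coders is Cohen's kappa; a minimal sketch, with hypothetical codes:

    from collections import Counter

    # Cohen's kappa: agreement between two coders, corrected for chance
    def cohens_kappa(coder_a, coder_b):
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned to the same ten remarks by two evaluators
    a = ["nav", "nav", "term", "trust", "nav", "term", "nav", "trust", "term", "nav"]
    b = ["nav", "term", "term", "trust", "nav", "term", "nav", "nav", "term", "nav"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.67 here; higher means more repeatable coding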

34
Q

Interpreting outcomes of data analysis

A

Outcome of data analysis should be list of usability problems found during evaluation

35
Q

Usability defect

A
Irritates/confuses user
Makes system hard to learn/install/use
Causes mental overload of user
Causes poor user performance
Violates design standards or guidelines
Reduces trust in the system
Tends to cause repeat errors
Could make product hard to market
36
Q

Identifying causes of usability defects

A

Need to look in greater depth at various sources of evaluation data you have collected

37
Q

Prioritising issues

A

Prioritise the list by assigning a severity rating to each defect
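A minimal sketch of a prioritised defect list, assuming a 0-4 severity scale in the style of Nielsen's ratings (0 = not a problem, 4 = usability catastrophe); the defects are hypothetical:

    # Hypothetical defects, each with an assigned severity rating (0-4)
    defects = [
        ("Jargon on the checkout button confuses users", 2),
        ("Payment fails with no error message", 4),
        ("Search results load slowly", 3),
    ]

    # Address the most severe defects first
    for description, severity in sorted(defects, key=lambda d: d[1], reverse=True):
        print(f"[severity {severity}] {description}")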

38
Q

Proposing changes, making recommendations

A

Recommendations likely to contain several points -

Successes to build upon
Defects to fix
Possible defects or successes not proven

39
Q

Presenting findings

A

Key is to provide useful, well-founded info concisely and accurately, in a form appropriate to audience

40
Q

Planning an evaluation involving users

A
Determine goals and questions
Choose approach and methods
Plan data collection
Address practical issues
Consider any ethical issues
Plan how to analyse, interpret and present data
Assemble materials needed to support evaluation
Pilot studies
41
Q

Determine goals and questions

A

Before an evaluation, be clear about why you are doing it.

What questions need to be answered?

What sort of info do you need to answer the question?

42
Q

Choose approach and methods

A

Choose appropriate evaluation approaches and methods to answer specific questions identified

43
Q

Plan data collection

A

What data do you need to answer the evaluation questions?

Is it feasible to collect that data?

Usability goals typically addressed by examining user behaviour, which can be captured qualitatively or quantitatively

44
Q

Address practical issues

A

Who, what, where, how (and constraints)

45
Q

Users (who)

A

What is profile of intended user?

How would you characterise them?

46
Q

Task (what)

A

What tasks, and how many, do you want to evaluate?

Why have you chosen these tasks?

47
Q

Different tasks you could include

A
  • Core tasks frequently performed by user
  • Tasks that have some new design features or functionality added
  • Critical tasks, even though they may not be frequently used
  • Tasks you feel have to be validated with users, to give the design team greater clarity and understanding
48
Q

Setting (where)

A

Where will evaluation take place?

How will you set up and lay out the session?

49
Q

Equipment (how)

A

Do you need any equipment?

50
Q

Constraints

A

Are there any practical constraints?

51
Q

Consider any ethical issues

A

What are ethical considerations of the evaluation? (Task, users, location, data collection)

52
Q

Plan how to analyse, interpret and present data

A

In order to collect data effectively, you must understand what you mean to do with it, and how you mean to analyse it

Always plan how you will analyse and interpret the data before you conduct an evaluation

53
Q

Assemble materials needed to support evaluation

A
  • Evaluation script - Helps to have detailed script of everything you will say to participants
  • Introductory and background info for participant
  • Informed consent
  • A task description
  • Data collection forms
  • Post session interview plan or questionnaire
  • Analysis plan
54
Q

Pilot studies

A

Often appropriate to run pilot studies to ensure choices made are practicable

A pilot study is a small-scale trial to check the evaluation

55
Q

Evaluator/observer bias

A

Participants' behaviour can be altered just by an observer watching them

56
Q

Methodology (bias)

A

The way an evaluation is designed and conducted can introduce bias

Counterbalance the order in which users do tasks (a small sketch follows)
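A small sketch of counterbalancing, cycling participants through every task ordering so that no task systematically benefits from coming first (tasks and participants are hypothetical):

    from itertools import cycle, permutations

    tasks = ["search", "filter", "checkout"]
    participants = ["P1", "P2", "P3", "P4", "P5", "P6"]

    # Rotate through all 6 orderings of 3 tasks so position effects cancel out
    for participant, order in zip(participants, cycle(permutations(tasks))):
        print(participant, "->", ", ".join(order))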

57
Q

Reporting and analysis

A

The analysis undertaken and the results reported will be influenced to some extent by the choice of evaluator

58
Q

Cognitive walkthrough

A
  • Will users know what to do?
  • Will users see how to do it?
  • Will users understand from the feedback whether the action was correct or not?