HCI Examples Topics Flashcards

1
Q

Perceptual Task -
Discrimination

A

Telling whether a difference occurs in sensory stimulation.

2
Q

Perceptual Task -
Detection

A

Telling whether an event of interest occurs, or not, in the environment.

3
Q

Perceptual Task -
Recognition

A

Categorizing a stimulus as something.

4
Q

Perceptual Task -
Estimation

A

Estimating a property of an object or event in the environment.

5
Q

Perceptual Task -
Search

A

Localizing an object of interest.

6
Q

Sensory modalities -
Vision

A
  • Fast
  • High bandwidth for parallel processing
  • Field of view of 180 degrees
  • Can interfere with primary task
7
Q

Sensory modalities -
Hearing

A
  • Very fast
  • Field of hearing of 360 degrees
  • Serial presentation
  • Ineffective in a noisy environment
8
Q

Sensory modalities -
Tactition

A
  • Fast
  • Limited to areas of physical contact
9
Q

Fitts’ Law -
Considerations

A

The validity and implications of results are highly dependent on biological and task constraints.
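The card leaves the law itself implicit. As a reminder, the Shannon formulation predicts movement time from target distance and width; the sketch below uses illustrative regression constants (`a` and `b` are placeholders, not empirical values):

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predict movement time (s) via the Shannon formulation of Fitts' Law.

    a, b are device-specific regression constants; the defaults here are
    illustrative placeholders, not fitted values.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A distant, small target takes longer than a near, large one.
near_large = fitts_movement_time(distance=100, width=50)
far_small = fitts_movement_time(distance=800, width=10)
```

This also illustrates the "considerations" point: the constants only hold for the device and task they were fitted to.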

10
Q

Design objectives

A
  • Efficiency
  • Learnability
  • Usability
  • Consistency
  • Accessibility
  • Explorability
11
Q

Design objectives -
Efficiency

A

Performance, understood through the speed-accuracy trade-off.

12
Q

Design objectives -
Accessibility

A

Equivalent levels of usability across user groups

13
Q

Design objectives -
Usability

A

Qualities of the user interface that allow users to achieve their goals effectively, efficiently and enjoyably

14
Q

Design objectives -
Learnability

A
  • Easy to learn
  • Time to become proficient
  • How to allow optimal performance
15
Q

Two-axis model of collaborative technology

A
  • Synchronous / Asynchronous
  • Co-located / Remote
16
Q

Reality-based Interaction

A

A framework that provides aims for building interactive technology that better supports and exploits our capabilities (using skills and awareness)

17
Q

Reality-based Interaction -
Aims

A
  • Naïve physics
  • Body-awareness and skills
  • Environment awareness and skills
  • Social awareness and skills
18
Q

Collaboration

A

Collaboration emphasizes a joint construction of shared goals and ways of doing the work.

19
Q

Cooperation

A

Cooperation implies division of labor between parties, where each party is responsible for a different aspect of problem solving.

20
Q

Dimensions of coordination

A
  • Articulation work
  • Awareness
  • Boundary objects
21
Q

Coordination factors -
Articulation work

A

Describes activities extraneous to the work itself.

Important for organizing the work in a way that is situationally appropriate.

22
Q

Coordination factors -
Awareness

A

A collaborator’s ability to follow what others are doing, how their subtasks are progressing, and what they attend to.

23
Q

Coordination factors -
Boundary objects

A

Objects that are shared among collaborators to help them coordinate or share information.

24
Q

System boundary

A

Anything within the system boundary will be mapped out and anything outside the boundary is out-of-scope.

The boundary should encapsulate everything necessary for the system to operate.

25
Q

Types of automation

A
  • Acquisition
  • Analysis
  • Action
  • Decision
  • Adaptive
27
Q

Types of automation -
Acquisition

A

System sensing and registering of input data

28
Q

Types of automation -
Analysis

A

Automation of information analysis.

e.g. extrapolation or prediction of data, or integration of multiple input sources.

29
Q

Types of automation -
Decision

A

Deciding and selecting appropriate actions among decision alternatives.

30
Q

Types of automation -
Action

A

The machine is partially or fully executing an action choice.

31
Q

Types of automation -
Adaptive

A

The type and level of automation are allowed to vary depending on context.

32
Q

Automation levels (1-10) -
Computer control

A
  • 10: decides everything
  • 9: informs human only if it decides
  • 8: informs human only if asked
  • 7: executes automatically and informs the human
  • 6: allows restricted time before automatic execution
  • 5: acts automatically if user approves
  • 4: selects one alternative action
  • 3: narrows selection to a few
  • 2: offers complete set of alternatives
  • 1: offers no assistance
33
Q

User-centric evaluation criteria for automation

A
  • Can increase or decrease mental workload
  • Can affect situational awareness
  • Can cause complacency due to overconfidence or excess trust
  • Can cause skill degradation
34
Q

Mixed-initiative interface principles

A
  • Developing significant value-added automation
  • Considering uncertainty in a user’s goals
  • Considering the status of a user’s attention when timing services
  • Inferring the ideal action in light of costs, benefits, and uncertainties
  • Employing dialogue to resolve key uncertainties
  • Allowing safe and efficient termination
  • Minimizing the cost of poor guesses and timing
  • Providing mechanisms for efficient agent-user collaboration
  • Continuing to learn through observation
  • Maintaining a working memory of recent interactions
35
Q

Risk assessment methods

A
  • SWIFT
  • FMEA
  • Fault Tree
36
Q

Structured What-If Technique (SWIFT)

A

A team-based risk assessment method that prompts teams to ask what-if questions to stimulate thinking about possible risks and hazards in a system.

37
Q

SWIFT Approach

A
  • Based on a vocabulary which serves as prompts.
  • Words used as facilitators to discuss possible scenarios
  • Focus on deviations like “failure to detect”, “wrong message / time / delay”
  • Columns: identifier - what-if question - risk/hazard - relevant control - risk ranking - action notes
38
Q

Failure mode and effects analysis (FMEA)

A

Used to analyse human error at both the individual and team level.

39
Q

FMEA Approach

A
  • Identifier
  • Component
  • Failure mode
  • Causes
  • Probability
  • Severity
  • Risk
  • Recovery
  • Action notes
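The columns above can be modeled as a simple record. Note the scoring rule is an assumption: risk is computed here as probability × severity, a common convention not stated on the card, and the example row is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class FmeaRow:
    """One row of an FMEA worksheet, mirroring the card's columns."""
    identifier: str
    component: str
    failure_mode: str
    causes: str
    probability: int  # e.g. 1 (rare) .. 5 (frequent) -- assumed scale
    severity: int     # e.g. 1 (minor) .. 5 (catastrophic) -- assumed scale
    recovery: str = ""
    action_notes: str = ""

    @property
    def risk(self):
        # Assumed convention: risk score = probability x severity.
        return self.probability * self.severity

row = FmeaRow("F1", "Alert dialog", "User dismisses without reading",
              "Habituation to frequent pop-ups", probability=4, severity=3)
# row.risk == 12
```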
40
Q

Fault Tree

A
  • Diagrammatic method for identifying and analyzing the factors contributing to a fault (an unintended behaviour).
  • Created by starting with the fault as the top-level event and then progressively analyzing factors contributing to the fault.
  • Highlights interrelationships between components in the system and users.
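The top-down decomposition can be sketched as nested AND/OR gates over basic events; the tree structure and event names below are hypothetical:

```python
# Minimal fault-tree evaluator: the top-level fault is decomposed through
# AND/OR gates down to basic events.
def evaluate(node, active_events):
    """Return True if the (sub)fault occurs given the active basic events."""
    kind = node[0]
    if kind == "event":
        return node[1] in active_events
    children = [evaluate(child, active_events) for child in node[1]]
    return all(children) if kind == "AND" else any(children)

# Fault "message not delivered" occurs if the network fails, OR if the
# sender mistypes the address AND the system fails to warn them.
tree = ("OR", [
    ("event", "network failure"),
    ("AND", [("event", "address typo"), ("event", "no validation warning")]),
])
evaluate(tree, {"address typo"})                           # False
evaluate(tree, {"address typo", "no validation warning"})  # True
```

The AND branch is where the tree highlights user-system interrelationships: the typo alone is not enough; the missing system safeguard must co-occur.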
41
Q

Needs according to SDT

A
  • Autonomy: the sense that actions are performed willingly
  • Competence: the feeling of achieving mastery and controlling the outcomes of actions
  • Relatedness: the sense of reciprocal belonging in relation to other humans
42
Q

Types of human error

A
  • Mistakes due to formulation of an incorrect intention.
  • Slips due to failure to carry out the action correctly.
43
Q

Skills, Rules, and Knowledge (SRK) Model

A

A framework for understanding the performance of skilled users.

44
Q

SRK Model -
Skill-based behaviour

A

Behaviour has high automaticity and happens without conscious control.

45
Q

SRK Model -
Rule-based behaviour

A

Behavior characterized by the user employing stored procedures or rules of the type ”if X then Y”: such rules can be learned or acquired by experience.

46
Q

SRK Model -
Knowledge-based behaviour

A

The user is faced with unfamiliar situations where the user has not developed any rules or knowledge for how to control the system.

User explicitly formulates a goal and develops a plan to achieve this goal, evaluated through trial-and-error or consideration of consequences.

47
Q

Risk management

A
  • Hazard identification
  • Risk estimation
  • Risk evaluation
  • Risk control
  • Risk monitoring
48
Q

Research strategy principles

A
  • Bounded: choice of research method bounds empirical results
  • Trade-offs: there are often trade-offs to consider between essential criteria, i.e. realism, precision and generalizability
  • Triangulation: using multiple research methods to observe the same phenomena
49
Q

Research strategy principles -
Bounded

A

There is no fully accurate method of user research; the goals of the system therefore need to be clarified before user research can take place.

50
Q

Research strategy principles -
Trade-offs

A
  • Realism concerns how similar the situation being studied is to the situations that the researcher wants to gather insights about.
  • Precision involves how much detailed, accurate information is possible to collect, and how the lack of this would affect user research.
  • Generalizability concerns how well findings generalize to other people or situations.
51
Q

Methodological Qualities

A
  • Validity
  • Reliability
  • Transparency
  • Ethics
52
Q

Methodological Qualities -
Validity

A

Concerns whether conclusions obtained from user research are warranted/accurate.

53
Q

Methodological Qualities -
Reliability

A

Concerns whether user research results are consistent or reproducible.

54
Q

Methodological Qualities -
Transparency

A

The researcher should make sure that designs, data, analysis and derivation of conclusions are accessible and inspectable.

55
Q

Methodological Qualities -
Ethics

A

The researcher should carefully consider what is right and wrong when collecting, analyzing and reporting data.

56
Q

Contextual inquiry principles

A
  • Context: being close to the interviewee and their activity
  • Partnership: collaboration to understand user’s actions
  • Interpretation: attempt to create meaning of activity
  • Focus: should aim for depth
57
Q

Personalisation

A

User making non-functional (e.g. appearance) changes to the interactive system.

Can lead to cognitive, social and emotional effects.

58
Q

Tailoring

A

User intentionally modifies the functionality of the system.

Types are customization, integration, and extension.

59
Q

Appropriation Guidelines

A
  • Allow interpretation
  • Provide visibility
  • Expose intentions
  • Support, not control
  • Pluggability and configuration
  • Encourage sharing
  • Learn from appropriation
60
Q

Appropriation Guidelines -
Allow interpretation

A

Avoid fixing meanings, but include elements in the system that allow users to add their own meanings.

61
Q

Appropriation Guidelines -
Provide visibility

A

Providing extra visibility about a system’s functionality and status can help users understand how to develop appropriations.

62
Q

Appropriation Guidelines -
Expose intentions

A

Tell users the intended purpose of a function, for clarity. This guards against subversion, or use for the opposite of the intended purpose.

63
Q

Appropriation Guidelines -
Support, not control

A

The design should not force users to do tasks in a particular way (control), but should allow for flexible variation.

64
Q

Appropriation Guidelines -
Pluggability and configuration

A

The design should allow users to create their own systems or modify systems for their own purposes.

65
Q

Appropriation Guidelines -
Encourage sharing

A

By encouraging users to share their appropriations, these can be reappropriated by others.

66
Q

Appropriation Guidelines -
Learn from appropriation

A

Observing the ways users appropriate technology can provide a source of insights for product development.

67
Q

Factors that affect appropriation

A
  • Designers are unlikely to understand all the tasks or environments in which a product is used.
  • Users’ needs and situations change
68
Q

Appropriation moves

A
  • Actively championing the use of an interactive system
  • Using a different way of accomplishing work due to a perceived deficiency in the system
  • Criticizing the interactive system by comparing it to other methods
  • Interpreting the system by explaining its meaning to others
  • Attempting to make others reject the interactive system, or preventing its usage
  • Being slow to take up the system, or otherwise contributing to inertia in its uptake
69
Q

Goal-directed interaction

A

Interaction is viewed as a dialogue in which the user acts to achieve a goal in the system.

70
Q

Gulf-of-execution

A
  1. Goals
  2. Form intention
  3. Specify action
  4. Execute action
  5. Environment
71
Q

Gulf-of-evaluation

A
  1. Environment
  2. Perceive world
  3. Interpret state
  4. Evaluate state
  5. Goals
72
Q

Cognitive dimensions of notation

A
  • Viscosity: resistance to change
  • Visibility: ability to view components easily
  • Premature commitment: constraints on order of doing things
  • Hidden dependencies: important links between hidden entities
  • Role-expressiveness: purpose of an entity is readily inferred
  • Error-proneness: notation invites mistakes and gives little protection to system
  • Consistency: similar semantics are expressed in similar syntactic forms
  • Diffuseness: verbosity of language
73
Q

System-centric evaluation criteria for automation

A
  • Automation reliability
  • Costs of decisions and action outcomes
74
Q

System-centric evaluation criteria -
Automation reliability

A
  • Extent to which the system is effective in automation
  • True positive rate (sensitivity)
  • False positive rate (false alarm rate)
  • A high false alarm rate can cause fatigue among users and foster distrust in the system
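Both rates can be computed directly from confusion-matrix counts of automated alerts versus actual events; the counts below are made up for illustration:

```python
def tpr_fpr(true_positives, false_negatives, false_positives, true_negatives):
    """Compute the true-positive rate (sensitivity) and the false-positive
    rate (false-alarm rate) from confusion-matrix counts."""
    tpr = true_positives / (true_positives + false_negatives)
    fpr = false_positives / (false_positives + true_negatives)
    return tpr, fpr

# Hypothetical alerting system: caught 90 of 100 real events,
# but raised 50 false alarms across 900 non-events.
sensitivity, false_alarm_rate = tpr_fpr(90, 10, 50, 850)
# sensitivity = 0.9, false_alarm_rate ~= 0.056
```

A false-alarm rate this high in absolute terms (50 alarms for 90 true hits) is exactly the regime where alarm fatigue and distrust set in.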
75
Q

System-centric evaluation criteria -
Costs of decisions and action outcomes

A

The potential benefits of automation have to be weighed against any possible disadvantages.

76
Q

Solution neutral problem statement

A

Expresses overall objective as a problem statement that avoids framing it in solution-dependent terms.

77
Q

High-level verification methods

A
  • Demonstration
  • Inspection
  • Test
  • Analysis
78
Q

Deriving a solution neutral problem statement

A
  1. Remove requirements and constraints that have no direct relationship to addressing the problem statement.
  2. Transform quantitative statements into qualitative statements.
79
Q

Ill-defined problem

A

An ill-defined problem has no clear objectives, or has too many constraints that may be mutually dependent in some complex or unknown manner.

81
Q

Goal refinement

A
  • Take perspectives to redefine the problem and reinterpret it in the light of a potential solution under consideration.
  • May involve relaxing assumptions, or refining them in light of knowledge and insight gained by considering a potential solution.
  • Study technical and organizational materials to understand the constraints, and discuss them with relevant stakeholders.
82
Q

Well-defined design task

A
  • Design decisions
  • Design space
  • Objectives
  • Constraints
83
Q

Process models

A
  • Human factors engineering
  • Agile methodology
  • Usability engineering
84
Q

Human factors engineering

A

The design and construction of safe and reliable interactive technologies.

Emphasis on safety and mitigation of human error.

85
Q

User-centered design processes

A
  • User research
  • Formation of design goals (from requirements)
  • Generation of design ideas
  • Evaluation
86
Q

Usability engineering

A

A lifecycle model to help developers not only launch products but to keep them updated.

87
Q

Usability engineering -
Drawbacks

A
  • Limited support for design as a creative activity
  • Glorified trial and error
88
Q

10 phases of usability engineering

A
  • Know the user
  • Competitive analysis
  • Setting usability goals
  • Design stage
  • Coordinated design
  • Guidelines and heuristics
  • Prototyping
  • Empirical user testing
  • Iterative design
  • Collecting feedback from the field
89
Q

User research methods

A
  • Open-ended interview
  • Contextual inquiry
  • Observation
  • Ethnography
  • Surveys
  • Diaries
  • Log file analysis
  • Analysis of archival data
90
Q

Between-subjects design

A
  • Participants are exposed to one condition in the experiment.
  • (+) No learning effect across conditions
  • (-) Individual error is not controlled -> more participants needed
91
Q

Within-subjects design

A
  • Participants exposed to all conditions in the experiment.
  • (+) need fewer participants and individual error is controlled within the participant
  • (-) learning effect across conditions -> if asymmetric then the design is invalid
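One common way to mitigate the learning effect is to counterbalance condition order across participants, e.g. with a cyclic Latin square. This is a generic technique, assumed here rather than named on the card:

```python
def latin_square_orders(conditions):
    """Cyclic Latin square: rotate the condition list so that each
    condition appears exactly once in every ordinal position."""
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

# Assign each participant one row, cycling through the rows:
orders = latin_square_orders(["A", "B", "C"])
# [['A', 'B', 'C'], ['B', 'C', 'A'], ['C', 'A', 'B']]
```

This distributes practice effects evenly over conditions, but as the card notes, an asymmetric transfer effect (e.g. condition A permanently teaches a strategy that helps B) cannot be counterbalanced away.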
92
Q

Internal validity

A

The measured changes in the dependent variables are solely due to manipulations in the independent variables.

93
Q

External validity

A

The experimental task is generalizable to a wide variety of usage contexts outside the experimental setting.

94
Q

Closed-loop control

A
  • The user is using visual feedback to guide interaction.
  • Slow, deliberate and accurate movements.
95
Q

Open-loop control

A
  • The user is recalling an action from motor memory to perform an action.
  • Fast but imprecise
96
Q

Marking menu

A

A pie menu that has been augmented with a mechanism for performing a menu shortcut by articulating a gesture, a “mark”, which has the same movement pattern as a menu selection through the pie menu.
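A minimal sketch of the selection mechanism: the direction of the mark picks one of n equal pie sectors. The sector layout and item labels below are illustrative assumptions:

```python
import math

def select_item(dx, dy, items):
    """Map a stroke vector (dx, dy) to the pie-menu item in that direction.

    Items are assumed to be laid out counter-clockwise in equal sectors,
    starting at 'east'. Screen y grows downward, hence the -dy.
    """
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    sector = 360 / len(items)
    index = int(((angle + sector / 2) % 360) // sector)
    return items[index]

items = ["east", "north", "west", "south"]  # hypothetical 4-item pie menu
select_item(10, 0, items)    # 'east'  (rightward stroke)
select_item(0, -10, items)   # 'north' (upward stroke)
```

The same function serves both modes: a novice traces the displayed pie menu, while an expert articulates the mark from motor memory without waiting for the menu to appear.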

97
Q

Morphological chart

A

A table in which the rows map specific functions to a set of candidate function carriers, often referred to as solutions.
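Enumerating candidate design concepts from such a chart amounts to choosing one function carrier per row, i.e. a Cartesian product over the rows. The chart contents below are hypothetical:

```python
from itertools import product

# Hypothetical morphological chart: each row maps a function to its
# candidate function carriers ("solutions").
chart = {
    "input text":    ["physical keyboard", "on-screen keyboard", "speech"],
    "confirm":       ["button", "gesture"],
    "give feedback": ["sound", "vibration", "visual highlight"],
}

# One concept = one carrier per function; enumerate every combination.
concepts = [dict(zip(chart, combo)) for combo in product(*chart.values())]
len(concepts)  # 3 * 2 * 3 = 18 candidate design concepts
```

The combinatorial growth is the point: even a small chart yields many concepts, which the designer then prunes against objectives and constraints.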

98
Q

GOMS Task Analysis

A
  • Goals: aims of the user
  • Operators: performable actions in the interface
  • Methods: sequences of sub-goals/operators used to achieve a particular goal
  • Selection rules: rules used to choose a method to achieve a goal
99
Q

Keystroke-level modeling (KLM-GOMS)

A

A simple mathematical model for assessing task performance by predicting the task completion time of experienced users.

100
Q

KLM-GOMS standard operators

A
  • K: Key press (0.3 s)
  • T(n): Type sequence of n characters (n × K s)
  • P: Point mouse to display target (1 s)
  • B: Press/release mouse button (0.1 s)
  • BB: Click mouse button (0.2 s)
  • H: Move hands between mouse/keyboard (0.5 s)
  • M: Mental act of routine thinking or perception (1.2 s)
  • W(t): Wait time for system response (t s)
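A KLM prediction is just the sum of the operator times. A minimal calculator using the values above (T(n) is handled as n repeated K operators; W(t) would add the literal wait time):

```python
# Operator times from the card, in seconds.
OPERATOR_TIMES = {
    "K": 0.3,  # key press
    "P": 1.0,  # point mouse to display target
    "B": 0.1,  # press/release mouse button
    "H": 0.5,  # move hands between mouse and keyboard
    "M": 1.2,  # mental act of routine thinking or perception
}

def klm_time(operators):
    """Sum the predicted expert completion time for a sequence of operators."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical task: point at a field, click (BB), home to the keyboard,
# think, then type 4 characters: P + B + B + H + M + 4K.
total = klm_time(["P", "B", "B", "H", "M", "K", "K", "K", "K"])  # 4.1 s
```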
101
Q

Limitations of KLM-GOMS

A
  • Assumes error-free expert behaviour
  • Ignores learning curve effects
  • Assumes reliable fixed time estimates for all operators
102
Q

Wicked problem

A

Not possible to identify a well-defined design task through goal refinement.

103
Q

System-mapping techniques

A
  • Process diagram
  • System diagram
  • Task diagram
  • Communication diagram
  • Organization diagram
  • Information diagram
104
Q

Systems approach considerations/failures

A
  • Consider the environment the system operates in.
  • Understand non-technical factors
  • Address planned and unplanned interactions between components in system and interactions with environment.
  • Part of wider user experience system.
105
Q

Systems approach principles

A
  • Define, revise, and pursue the project
  • Think holistically
  • Follow a systematic procedure
  • Be creative
  • Take account of the people
  • Manage the project and the relationship
106
Q

User-centered design processes -
Features

A
  • HFE: focus on safety and managing human error
  • Agile: focus on rapid iteration and changing requirements
  • Usability engineering: largely trial and error