Lecture 6 - Very Mobile HCI Flashcards
What is fragmented attention?
When two things compete for our focus, so we cannot devote 100% of our attention to a single thing; instead our attention is divided.
What is fragmented attention in the mobile hci context?
When our attention is divided between the mobile device and another stimulus.
What is the resource competition framework?
The idea that we have limited cognitive resources and hence we often have to prioritize attention.
What is situational impairment?
when the context restricts a person’s ability to interact with the device. This is typically physical, but not always (e.g. someone talking to you)
What is an example of a situation where we have competing demands and hence fragmented attention?
walking and looking at our phone
Is fragmented attention bad or good?
Bad, as it can lead to dangerous situations depending on the context.
What do users face in the modern world?
situational impairment
When we are facing situational impairment, what input and output modality is greatly affected?
the touchscreen: glare might stop us from seeing the screen (output), while movement might make touch input unstable (input)
What are ‘very mobile’ scenarios?
ones that have increased physical and cognitive demands e.g. driving
What can happen when our attention is divided between tasks and we prioritize one of them?
Our attention can fully divert to the prioritized task.
What is the implication of diverting attention?
potentially dangerous situations, e.g. choosing to use the phone instead of steering the car while driving
Very mobile scenarios are more likely to:
- divide our interaction and cognitive resources
- divert our attention
What things demand attention while cycling?
- situational awareness
- road users
- pedestrians
- traffic signals
- road rules
- surface / terrain
- navigation
- scenery
What things are physically demanding when cycling?
- exertion
- steering
- braking
- gear shifting
- balance
Is it hard to use a mobile device on a bike?
yes, at least in the standard way: impacts while riding affect interaction and cause device instability, and we are also not completely hands-free
What things demand attention in a running example?
- situational awareness
- pedestrians
- traffic
- surface / terrain e.g. potholes
- road crossings
- navigation
- scenery e.g. obstacles
What are the physical demands of running?
- exertion
- gait (impact of stepping and arms swinging, massive effect on interaction)
- balance
- arm swing (partly covered under gait above)
What are the things that demand our attention while driving?
- situational awareness
- pedestrians
- other road users
- traffic signals
- road rules
- navigation
- scenery
- dashboard
- displays
- devices
Why is having devices nearby while driving dangerous?
Because it fragments our attention by increasing mental demands (see the resource competition framework: we might have to prioritize one task over the other).
What are the physical demands of driving?
- gear changes
- pedal and wheel control
- maneuver signalling
(hands are not free for input)
How should we design for mobility?
We need to anticipate that standard interaction capabilities will be limited, so we need to think about:
- alternative ways of input
- alternative ways of output
- perhaps interaction via other sensors, e.g. gestures via the camera
Your choice will depend on the scenario.
We should also minimize visual attention demands by making the UI simple and clear, and by minimizing the available actions while attention is divided.
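A minimal sketch of that last idea, assuming a web app (the element ids "full-ui" and "simplified-ui", the motion threshold, and the use of DeviceMotionEvent are illustrative assumptions, not from the lecture): when motion is detected, the app switches to a simplified layout with fewer available actions.

```typescript
// Sketch: switch to a low-attention UI when the device appears to be moving.
const MOTION_THRESHOLD = 3; // m/s^2, illustrative value that would need tuning

window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const a = event.acceleration;
  if (!a) return;

  // Magnitude of acceleration (gravity excluded), with null components treated as 0.
  const x = a.x ?? 0, y = a.y ?? 0, z = a.z ?? 0;
  const moving = Math.sqrt(x * x + y * y + z * z) > MOTION_THRESHOLD;

  // Minimize the available actions while the user's attention is divided.
  const fullUi = document.getElementById("full-ui");
  const simpleUi = document.getElementById("simplified-ui");
  if (fullUi) fullUi.hidden = moving;
  if (simpleUi) simpleUi.hidden = !moving;
});
```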
What does moving away from the touchscreen (e.g. using gestures) reduce?
- reduces visual attention necessary
- reduces physical contact necessary
- reduces potential precision (depending on modality)
What are the alternative input modalities to touchscreen?
- air gestures
- speech
- physical controls (no visual attention required; they can be felt when activated, giving immediate feedback -> less need for stability and precision, as grasping the surrounding area can stabilize the device)
- touchscreen gestures (less precision needed, reduced visual attention to locate a target)
- contactless interactions (i.e. the air gestures and speech mentioned above -> these have a lot of usability issues)
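A minimal sketch of contactless speech input in a browser, assuming the Web Speech API is available (it is non-standardised and often vendor-prefixed); the "next" command and its handling are hypothetical examples.

```typescript
// Sketch: a single spoken command as an alternative to touch input.
const SpeechRecognitionCtor =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

if (SpeechRecognitionCtor) {
  const recognition = new SpeechRecognitionCtor();
  recognition.lang = "en-GB";
  recognition.continuous = false; // one short command, not dictation

  recognition.onresult = (event: any) => {
    const command = event.results[0][0].transcript.trim().toLowerCase();
    // Hypothetical mapping from a spoken phrase to an action.
    if (command.includes("next")) {
      console.log("Skipping to the next item without touch or visual attention");
    }
  };

  recognition.start(); // in practice, tie this to a button press or wake word
} else {
  console.warn("Speech input unavailable; fall back to another modality");
}
```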
Why can contactless interactions be bad?
Because people's attention might be drawn away if they keep trying to perform an action without knowing how to do it properly
-> prioritization and hence a risk of diverting attention
Possible output modalities?
- audio
- haptic feedback
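A minimal sketch of both output modalities in a browser, using the Vibration API for haptics and speech synthesis for audio; the message text and vibration pattern are illustrative assumptions.

```typescript
// Sketch: notify the user without requiring visual attention.
function notifyWithoutVisualAttention(message: string): void {
  // Haptic feedback: a short double pulse (100 ms on, 50 ms off, 100 ms on).
  if ("vibrate" in navigator) {
    navigator.vibrate([100, 50, 100]);
  }

  // Audio feedback: speak the message so the user does not have to look down.
  if ("speechSynthesis" in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(message));
  }
}

// Example: confirm a navigation event while the user's eyes stay on the road.
notifyWithoutVisualAttention("Turn left in 200 metres");
```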
What questions should we ask when designing output modalities?
- can they be perceived
- can they be understood
What happens if our output is not perceived or understood?
The user's visual attention might divert to check.
Possible alternative for output to avoid diversion?
visual feedback in the line of sight, e.g. Google Glass
What do we want to achieve in very mobile situations?
interactions that minimize the need for the user to divert resources (attention and concentration), e.g. by minimizing visual demands to avoid attention fragmentation