Eye Movement and Gaze as Input Flashcards
Why Gaze as Input?
- Gaze is a natural pointer
- Why not use it instead of a mouse?
Eye-Tracking
- Illumination of the eye with infrared light
- Glints: reflections on the cornea
- Use computer vision to track eye-in-head movement as offset between pupil center and corneal reflection (PCCR method)
- Calibration to a screen by looking at predefined points
- Output: stream of coordinates
- Input events: fixations, dwell time (see the sketch below)
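A minimal sketch of turning the coordinate stream into fixation events, in the style of a dispersion-threshold (I-DT) detector; the sample format and both thresholds are illustrative assumptions, not from the slides.

```python
# Minimal dispersion-threshold (I-DT style) fixation detector.
# Assumes gaze samples arrive as (x, y, t) tuples in screen pixels
# and seconds; both thresholds are illustrative, not from the slides.

DISPERSION_PX = 30     # max spread of samples within one fixation
MIN_DURATION_S = 0.1   # minimum duration to count as a fixation

def detect_fixations(samples):
    """Return a list of (cx, cy, start_t, end_t) fixations."""
    fixations = []
    window = []
    for x, y, t in samples:
        window.append((x, y, t))
        xs = [s[0] for s in window]
        ys = [s[1] for s in window]
        # Dispersion = horizontal spread + vertical spread of the window.
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > DISPERSION_PX:
            done = window[:-1]  # the sample that broke the window starts anew
            if done and done[-1][2] - done[0][2] >= MIN_DURATION_S:
                cx = sum(s[0] for s in done) / len(done)
                cy = sum(s[1] for s in done) / len(done)
                fixations.append((cx, cy, done[0][2], done[-1][2]))
            window = [window[-1]]
    return fixations
```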
Gaze pointing is not like mouse pointing
- Midas Touch
  - We don’t want to trigger input on everything we happen to look at
  - Gaze does not have an obvious ‘click’ method
- Accuracy / Precision
  - Gaze fixations have natural jitter, and eye tracking is not perfect (see the smoothing and dwell sketch below)
- Expressiveness
  - We can quickly point our eyes at an object, but we cannot move or manipulate objects with gaze
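One common mitigation for Midas Touch and jitter is to smooth the gaze signal and require a dwell time before a look counts as a ‘click’. A minimal sketch, assuming rectangular targets; all names and thresholds are illustrative.

```python
# Sketch of dwell-time selection: a target only "clicks" after the
# smoothed gaze has stayed on it for DWELL_S seconds, so we do not
# trigger input on everything we merely glance at (Midas Touch).
# Target layout, names and thresholds are illustrative.

DWELL_S = 0.8   # dwell time before a look becomes a "click"
ALPHA = 0.3     # exponential smoothing factor to damp fixation jitter

class DwellSelector:
    def __init__(self, targets):
        self.targets = targets      # {name: (x, y, w, h)} rectangles
        self.sx = self.sy = None    # smoothed gaze position
        self.current = None         # target currently under gaze
        self.enter_t = None         # time gaze entered that target

    def update(self, x, y, t):
        """Feed one gaze sample; return a target name on dwell 'click'."""
        # Exponentially smooth the raw gaze to suppress jitter.
        if self.sx is None:
            self.sx, self.sy = x, y
        else:
            self.sx = ALPHA * x + (1 - ALPHA) * self.sx
            self.sy = ALPHA * y + (1 - ALPHA) * self.sy
        hit = self._hit_test(self.sx, self.sy)
        if hit != self.current:
            self.current, self.enter_t = hit, t   # gaze moved to a new target
            return None
        if hit is not None and t - self.enter_t >= DWELL_S:
            self.enter_t = t        # re-arm so we don't fire every frame
            return hit              # dwell complete: treat as a click
        return None

    def _hit_test(self, x, y):
        for name, (tx, ty, w, h) in self.targets.items():
            if tx <= x <= tx + w and ty <= y <= ty + h:
                return name
        return None
```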
Gaze Interaction Research at Lancaster
- Gaze and Hand
  - Combining input and taking advantage of eye-hand coordination
- Gaze and Motion
  - Understanding how the eyes interact with moving objects, and designing motion-based interfaces
- Gaze and Body
  - Understanding that gaze involves head and body movement, and taking advantage of the coordination of eye, head and body
Gaze and Hand
- Gaze selects, Touch manipulates (sketched below)
- Gaze is faster and requires less effort
- Touch is more expressive
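A minimal sketch of this division of labour, in the spirit of gaze-touch techniques: touch-down acquires whatever the eyes are on, and relative hand motion then manipulates it. The event handlers and scene API are hypothetical.

```python
# Sketch of "gaze selects, touch manipulates": on touch-down we grab
# whatever the eyes are looking at; relative hand motion then drags the
# gaze-selected object, so the finger never has to travel to the target.
# The scene API (hit_test, move_by) and event handlers are hypothetical.

class GazeTouchController:
    def __init__(self, scene):
        self.scene = scene        # exposes hit_test(x, y) -> obj or None
        self.gaze = (0, 0)        # latest gaze position on the display
        self.grabbed = None
        self.last_touch = None

    def on_gaze(self, x, y):
        self.gaze = (x, y)

    def on_touch_down(self, x, y):
        # Selection comes from gaze, not from where the finger lands.
        self.grabbed = self.scene.hit_test(*self.gaze)
        self.last_touch = (x, y)

    def on_touch_move(self, x, y):
        # Manipulation comes from the hand: relative touch deltas move
        # the gaze-selected object (indirect input).
        if self.grabbed is not None and self.last_touch is not None:
            self.grabbed.move_by(x - self.last_touch[0],
                                 y - self.last_touch[1])
        self.last_touch = (x, y)

    def on_touch_up(self, x, y):
        self.grabbed = None
```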
Gaze and Hand #2
- The eyes naturally look at what we want to manipulate
- Gaze-based mode switching
- The hand stays on the task
- Gaze modulates manual input
- The eyes stay where the input focus is
- The hand moves in and out of the gaze focus, for different input tasks
- Gaze naturally precedes the manual action
- Gaze seamlessly extends the reach of the hands
Gaze and Hands in 3D
- The hands do the work
- Gaze selects the closest target (sketched below)
- Skilled tasks
- Complex eye-hand coordination
- Rapid switching between objects that are juggled
- Reach across wide ranges
- Seamless manipulation from very small to very large objects
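A minimal sketch of gaze selection in 3D: pick the object whose centre lies at the smallest angle to the gaze ray, within an illustrative 5-degree cone; the vector helpers and names are assumptions.

```python
import math

# Sketch: in 3D, gaze can pick the target angularly closest to the gaze
# ray, leaving the hands free to do the work. The helpers and the
# 5-degree selection cone are illustrative choices.

MAX_ANGLE_DEG = 5.0   # ignore objects outside a small cone around gaze

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angular_distance(origin, direction, point):
    """Angle (radians) between the gaze ray and the line to `point`."""
    to_point = _normalize(tuple(p - o for p, o in zip(point, origin)))
    d = _normalize(direction)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(d, to_point))))
    return math.acos(dot)

def gaze_select(origin, direction, objects):
    """objects: [(obj, (x, y, z) centre), ...]; return the best match."""
    best, best_angle = None, math.radians(MAX_ANGLE_DEG)
    for obj, centre in objects:
        a = angular_distance(origin, direction, centre)
        if a < best_angle:
            best, best_angle = obj, a
    return best
```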
Gaze and Hand - Key Points
- Leveraging natural eye-hand coordination
- Design to leverage complementarity
- Gaze extending manual reach
- Modulating direct/indirect modes
- Careful coupling of modalities at discrete points
Gaze and Motion
In eye-tracking, we like to keep things still
* Fixations: when the eye is relatively still in the head
* Movement is suppressed and treated as noise
In natural gaze, we perform stabilizing eye movements
* Smooth Pursuit Eye Movement (SPEM) enables us to focus on objects that are moving
* Vestibulo-Ocular Reflex (VOR) stabilizes gaze when we move our head and body
* Optokinetic Nystagmus (OKN) stabilizes gaze when the visual scene is in motion
Pursuits: Selection by motion
- Objects moving in display space
- Eye movement tracked in its own space
- Correlation over a moving window (sketched after this card)
- Based on natural smooth pursuit
Pursuits
- Implicit selection, based on natural attention
- No calibration procedure
- Walk up and use
- Take turns
- Objects of any size
- Matched by motion, not by position
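A minimal sketch of the matching step, assuming Pearson correlation between the gaze trace and each object’s trajectory over a sliding window, computed per axis. Correlation is invariant to offset and scale, which is why no calibration is needed and targets are matched by motion rather than position; window size and threshold are illustrative.

```python
# Sketch of Pursuits-style motion correlation: compare gaze and object
# trajectories with Pearson correlation over a sliding window, per axis.
# Correlation ignores offset and scale, so this works uncalibrated and
# matches by motion, not position. WINDOW and THRESHOLD are illustrative.

WINDOW = 30        # samples (~0.5 s at 60 Hz)
THRESHOLD = 0.8    # minimum correlation on both axes to select

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return 0.0 if va == 0 or vb == 0 else cov / (va * vb) ** 0.5

def match_pursuit(gaze_trace, object_traces):
    """gaze_trace: [(x, y), ...]; object_traces: {name: [(x, y), ...]}."""
    if len(gaze_trace) < WINDOW:
        return None
    gx = [p[0] for p in gaze_trace[-WINDOW:]]
    gy = [p[1] for p in gaze_trace[-WINDOW:]]
    best, best_score = None, THRESHOLD
    for name, trace in object_traces.items():
        ox = [p[0] for p in trace[-WINDOW:]]
        oy = [p[1] for p in trace[-WINDOW:]]
        # Both axes must follow the object's motion.
        score = min(pearson(gx, ox), pearson(gy, oy))
        if score > best_score:
            best, best_score = name, score
    return best
```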
Orbits
- Widgets “clickable” by pursuit
- Size-invariant, but distinguished by direction, phase and velocity (sketched below)
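Orbit trajectories can be generated parametrically and fed to the matcher sketched above; two same-sized widgets stay distinguishable by direction or phase alone. All parameters are illustrative.

```python
import math

# Sketch: Orbits widgets are dots circling a control; the matcher above
# can tell same-sized widgets apart by orbit direction, phase or speed.
# All parameters are illustrative.

def orbit_position(cx, cy, radius, period_s, t, direction=1, phase=0.0):
    """Position at time t of a dot orbiting (cx, cy).

    direction: +1 counter-clockwise, -1 clockwise; phase in radians.
    """
    angle = direction * 2 * math.pi * t / period_s + phase
    return (cx + radius * math.cos(angle), cy + radius * math.sin(angle))

# Two same-sized widgets distinguished only by orbit direction:
# play_trace  = [orbit_position(100, 100, 20, 2.0, t, direction=+1) for t in ts]
# pause_trace = [orbit_position(100, 100, 20, 2.0, t, direction=-1) for t in ts]
```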
Gaze and Motion - Key Points
- What is hard with our hands is easy with our eyes
- Synchronising with external motion
- Decoupling of input and output
- Motion correlation as a selection principle, also with other modalities
Motion correlation as selection
* Control / awareness:
  * Explicit control: widgets displaying motion as trigger
  * Implicit input: animate content to make it attention-aware
  * Unaware input: use motion to invoke pursuit response
Gaze and Body
Gaze = the eye direction relative to the world = eyes + head + body
Gaze and Body #2
- In HCI, the interplay of eye, head and body has largely been ignored
- Desktop eye-tracking research: Gaze = eye movement, while head movement is suppressed or filtered
- Extended Reality (VR/AR) research: Gaze = the head direction, abstracting away eye movement
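A minimal sketch of that composition: rotate the tracker’s eye-in-head direction by the head (or head-plus-body) orientation to obtain the gaze direction in world coordinates. The (w, x, y, z) quaternion convention is an assumption; trackers and engines differ.

```python
# Sketch: gaze in the world = head/body orientation applied to the
# tracker's eye-in-head direction. Quaternions as (w, x, y, z) are an
# assumed convention; trackers and engines differ.

def quat_rotate(q, v):
    """Rotate vector v = (vx, vy, vz) by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + w*t + (q_vec x t), with t = 2 * (q_vec x v).
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def world_gaze(head_orientation, eye_in_head):
    """Gaze direction in world space: rotate the eye-in-head unit vector
    by the head (or combined head+body) orientation quaternion."""
    return quat_rotate(head_orientation, eye_in_head)
```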