Eye-Tracking and ERPs Flashcards
Visual world paradigm
Monitoring overt attention (via eye movements) to a visual scene during spoken language comprehension
Two key validations for visual world paradigm
1) Speech-mediated eye movements to relevant visual targets are closely time-locked to the linguistic stimuli (typically following the relevant word by ~200 ms)
2) These eye movements index a number of underlying comprehension mechanisms
Two modes of measuring overt attention
Active and passive tasks
How are looking patterns indicative of underlying cognitive processes? Two hypotheses
1) Coordinated Interplay Account
2) Joint Representation Account
Coordinated Interplay Account
Posits three processes:
(i) searching for visual referents of spoken referring expressions
(ii) grounding referring expressions in objects and events in the scene
(iii) using the visual scene to confirm or inform the linguistic interpretation
Joint Representation Account
- Anticipated linguistic meaning and visual scene information are not distinguishable from each other
- They interact with each other, and updating occurs on the joint representation
Traditional statistical tests for visual world paradigm
ANOVA
But it has drawn many criticisms (e.g., fixation proportions are bounded and violate ANOVA's normality and homogeneity assumptions)
Other methods, such as mixed-effects models, are being explored (see the sketch below)
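One commonly explored alternative is a mixed-effects regression over transformed fixation proportions. A minimal sketch, assuming hypothetical long-format data (one row per subject × condition × time bin, with counts of samples on the target object); the empirical-logit transform and statsmodels' mixedlm stand in for whatever model a given study actually uses.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical long-format data: one row per subject x condition x time bin,
# counting eye-tracker samples on the target object out of all samples.
df = pd.DataFrame({
    "subject":   np.repeat([f"s{i}" for i in range(1, 9)], 10),
    "condition": np.tile(["a", "b"], 40),
    "target":    rng.integers(0, 40, 80),   # samples on target in the bin
    "total":     np.full(80, 50),           # all samples in the bin
})

# Empirical-logit transform: keeps proportions of 0 or 1 finite and addresses
# one criticism of running ANOVA directly on bounded proportions.
df["elog"] = np.log((df["target"] + 0.5) / (df["total"] - df["target"] + 0.5))

# Linear mixed model: fixed effect of condition, random intercept per subject.
fit = smf.mixedlm("elog ~ condition", df, groups=df["subject"]).fit()
print(fit.summary())
```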
Measures in linguistics
Behavioural (offline or online)
Neuroscientific (high spatial or temporal resolution)
Corneal reflection
Infrared light reflected off the cornea, measured relative to the pupil centre; the pupil–corneal-reflection vector is used to estimate gaze direction
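As a rough illustration of why the reflection is measured relative to the pupil: the pupil-to-reflection vector rotates with the eye but is fairly robust to small head movements, so trackers map that vector to screen coordinates via calibration. A minimal sketch with made-up pixel values and a simple affine map; real systems typically fit higher-order polynomials over more calibration points.

```python
import numpy as np

# Hypothetical pupil-centre and corneal-reflection positions (camera pixels),
# recorded while the participant fixated three known calibration targets.
pupil  = np.array([[310.0, 242.0], [355.0, 240.0], [312.0, 280.0]])
cr     = np.array([[300.0, 238.0], [330.0, 237.0], [301.0, 260.0]])
screen = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 600.0]])  # target coords

# The pupil-CR vector changes with eye rotation, not small head translation.
vec = pupil - cr

# Fit a simple affine map from pupil-CR vectors to screen coordinates.
A = np.hstack([vec, np.ones((len(vec), 1))])
coeffs, *_ = np.linalg.lstsq(A, screen, rcond=None)

# Estimate gaze for a new pupil-CR measurement.
gaze = np.append([12.0, 6.0], 1.0) @ coeffs
print(gaze)
```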
Key measure in visual world paradigm
Fixations (typically analysed as proportions of looks to each object over time)
Three more common measures
First pass: summed fixation durations in a region before it is first exited
Regression path (go-past): summed durations from first entering a region until moving past it, including any regressions
Total time: all fixation durations in the region, including rereading
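A minimal sketch of how these three measures can be computed from an ordered fixation record; the (region, duration) tuple format and region numbering are assumptions for illustration, and the logic assumes the region is first encountered in normal left-to-right reading.

```python
def reading_measures(fixations, region):
    """First-pass, regression-path (go-past), and total time for one region.

    fixations: ordered (region_index, duration_ms) tuples for one trial.
    Simplified: assumes the region is first met in left-to-right reading.
    """
    first_pass = regression_path = total = 0
    entered = exited = passed = False
    for reg, dur in fixations:
        if reg == region:
            total += dur              # total time: every fixation in the region
            if not exited:
                entered = True
                first_pass += dur     # first pass: before first leaving the region
        elif entered:
            exited = True
            if reg > region:
                passed = True         # moved past the region to the right
        if entered and not passed:
            regression_path += dur    # go-past: includes regressive refixations
    return first_pass, regression_path, total


# Region 2 is the interest area; durations in ms.
fixes = [(1, 210), (2, 250), (2, 180), (1, 190), (2, 220), (3, 240)]
print(reading_measures(fixes, region=2))  # (430, 840, 650)
```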
ERP measures
Mean amplitude
Latency
Polarity
Topography
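A minimal sketch of extracting mean amplitude, peak latency, and polarity from a single averaged waveform; the sampling rate, analysis window, and random stand-in data are assumptions, and topography would simply repeat these measures across electrodes.

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 500.0                              # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.8, 1 / sfreq)    # epoch: -100 ms to 800 ms
erp = rng.standard_normal(len(times))      # stand-in averaged waveform (µV)

# Mean amplitude: average voltage in an analysis window (here 300-500 ms).
win = (times >= 0.3) & (times <= 0.5)
mean_amplitude = erp[win].mean()

# Peak latency and polarity: time and sign of the largest deflection in the window.
peak = np.argmax(np.abs(erp[win]))
peak_latency = times[win][peak]
polarity = "negative" if erp[win][peak] < 0 else "positive"

print(f"mean amplitude: {mean_amplitude:.2f} µV, "
      f"peak at {peak_latency * 1000:.0f} ms ({polarity})")
```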
Exogenous and endogenous ERPs
Exogenous: within ~100 ms, evoked by the external stimulus, bottom-up
Endogenous: after ~100 ms, reflect cognitive processes, top-down
Mismatch negativity (MMN)
The difference wave obtained by subtracting the ERP to frequent ‘standard’ stimuli from the ERP to rare ‘deviant’ stimuli
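The subtraction itself is straightforward; a minimal sketch with random stand-in epochs (trials × samples) for a single channel, where the epoch counts and sizes are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 400  # e.g. an 800 ms epoch at 500 Hz (assumed)

# Stand-in single-channel epochs (trials x samples), in microvolts.
standard_epochs = rng.standard_normal((200, n_samples))  # frequent stimuli
deviant_epochs = rng.standard_normal((40, n_samples))    # rare stimuli

# Average each condition, then subtract the standard ERP from the deviant ERP.
mmn = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)
```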
N400
Graded index of semantic surprise; a negative-going deflection peaking around 400 ms after stimulus onset