Psycholinguistics: Neurophysiology of Language Flashcards
Components by hemispheres
LEFT
Phonology
Semantics
Input-output lexicons
Syntax, morphology
RIGHT
Prosody
Coarse semantics (emotion terms, not literal meaning)
Discourse context
Process-specific neural networks
1 - Acoustic-phonological analysis
2 - Initial syntactic processes
3 - Computation of semantics & syntactic relations
Different processes of language
Phonology (sound)
Syntax (structure)
Semantics (meaning)
Pragmatics (use of language in conversation)
Discourse
Emotional comprehension / production
Acoustic-phonological analysis
N1 / N100
Primary auditory cortex - STG
Brain resources for language and music
Initial syntactic process
ELAN
(early left anterior negativity)
Inferior frontal gyrus (in particular Broca’s area, not just frontal operculum)
Broca’s aphasia -> non-fluent speech (verbs and adverbs omitted)
Articulation involves motor cortex
Computation of semantics and syntactic relations
N400
Temporal and frontal both interact with each other
> Anterior and posterior temporal cortex as well as the Inferior Frontal Gyrus

Two prosodies
Prosodic cues
Linguistic prosody
Pauses between words, tone in a statement vs a question
Emotional prosody
crucial in social interactions
Intonation
Prosodic cues
High collinearity among cues
-Pitch and pitch modulation
-Voice intensity
-Voice quality
-Speech rate and rhythm
Speech?
One of the most complex tasks, yet automatic
Why ERP
“It is hard to see the wood through the trees” Friederici
High temporal resolution (compared to MRI)
Precise knowledge of stimulus onset is crucial
Many stimuli are needed so they can be averaged (to get rid of noise)
Each peak is linked to a specific process
Nasion-Inion
Odd = left
Even = right
Problem with language production > motor artifacts
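The labeling rule on this card (odd digits left, even digits right, with the nasion-inion line as midline) can be sketched as a small helper; the function name is illustrative:

```python
def hemisphere(label: str) -> str:
    """Classify a 10-20 system electrode label by hemisphere.

    Odd terminal digits lie over the left hemisphere, even digits over
    the right; a trailing 'z' (zero) marks the nasion-inion midline.
    """
    suffix = label[-1].lower()
    if suffix == "z":                      # e.g. Fz, Cz, Pz
        return "midline"
    if suffix.isdigit():
        return "left" if int(suffix) % 2 == 1 else "right"
    raise ValueError(f"not a 10-20 label: {label}")
```

e.g. `hemisphere("F3")` gives `"left"`, `hemisphere("P4")` gives `"right"`, `hemisphere("Cz")` gives `"midline"`.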
ERP
component identification
polarity - amplitude - scalp location - sensitivity to experimental manipulations
Amplitude: peak or mean (mean is less sensitive to latency jitter)
C1: first visual component (polarity depends on the stimulated visual field)
Later time windows: peaks are more diffuse
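The peak-vs-mean amplitude distinction above can be sketched with NumPy; the window bounds and the simulated ERP are purely illustrative:

```python
import numpy as np

def peak_amplitude(erp, times, t_min, t_max):
    """Largest absolute deflection in the window (sensitive to latency jitter)."""
    mask = (times >= t_min) & (times <= t_max)
    window = erp[mask]
    return window[np.argmax(np.abs(window))]

def mean_amplitude(erp, times, t_min, t_max):
    """Average voltage in the window (less sensitive to latency jitter)."""
    mask = (times >= t_min) & (times <= t_max)
    return erp[mask].mean()

# Illustrative use: measure an N1-like window (75-150 ms) on a simulated ERP.
times = np.arange(-0.1, 0.5, 0.002)                              # seconds, 500 Hz
erp = -3.0 * np.exp(-((times - 0.11) ** 2) / (2 * 0.015 ** 2))   # negativity near 110 ms
print(peak_amplitude(erp, times, 0.075, 0.150))  # close to -3 (the full deflection)
print(mean_amplitude(erp, times, 0.075, 0.150))  # smaller in magnitude than the peak
```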
ERP
More electrodes ?
Be careful because
= more impedances to control (decreased with conductive gel)
= risk of cherry-picking (too much data)
ERP effects reported for acoustic-phonological analysis
N1
Mismatch negativity
Mismatch Negativity
Measures the discrimination of acoustic and phoneme categories (Bayesian probability)
Attention-independent: doesn’t require any task from the participant (useful with dyslexia)
N1
75-150 ms
Component generated in primary and secondary auditory cortex (can be modulated by frontal areas)
Mostly exogenous, but subject to top-down influence
Functional significance
Correlates with the identification of phonemes
MMN
150-250 ms
Mismatch negativity
auditory cortex
Functional significance
It reflects the discrimination of acoustic and phoneme categories (Pulvermüller)
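The MMN is standardly quantified as a difference wave, deviant minus standard, from a passive oddball sequence; a minimal simulated sketch (all amplitudes, latencies, and trial counts are illustrative, not empirical values):

```python
import numpy as np

rng = np.random.default_rng(0)
times = np.arange(-0.1, 0.4, 0.002)   # seconds

def simulate_trial(is_deviant):
    """One simulated auditory trial: an N1 for every tone, plus extra
    negativity in the 150-250 ms window (the MMN) for deviants, plus noise."""
    erp = -2.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.015 ** 2))       # N1
    if is_deviant:
        erp += -1.5 * np.exp(-((times - 0.20) ** 2) / (2 * 0.025 ** 2))  # MMN
    return erp + rng.normal(0, 0.5, times.size)

# Oddball sequence: ~85% standards, ~15% deviants; average within condition.
standards = np.mean([simulate_trial(False) for _ in range(170)], axis=0)
deviants  = np.mean([simulate_trial(True)  for _ in range(30)], axis=0)
mmn_wave = deviants - standards                 # the difference wave
window = (times >= 0.15) & (times <= 0.25)
print(mmn_wave[window].mean())                  # negative: the mismatch negativity
```

Because the MMN is computed from passively presented tone sequences, no response from the participant is needed, which is what makes it usable with populations such as dyslexic children.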
N400
Related to semantic congruency
Processing of semantics at word and sentence level
Reflects the difficulty of lexical-semantic integration
Increases with
non-word (ertertrrss) / pseudoword (murle)
unrelated pairs: N400 for “Nurse / Hospital” < “Nurse / Apple”
…
Decreases as the sentence unfolds, due to the increased predictability of the upcoming word
P600
processes of syntactic reanalysis and repair
Observed in situations of:
processing of syntactic anomalies
Temporarily ambiguous sentences
- “The judge believed the defendant was lying.”
- “The defendant” is first analyzed as an object, but “was” forces reanalysis as a subject
A model of language processing
The neurotemporal dynamics of language comprehension
3 sequential phases
Phase 1
An initial phrase structure is built on the basis of word-category information.
Phase 2
The relation between the verb and its arguments is computed to assign the thematic roles in the sentence.
Phase 3
The final interpretation takes place
Language process : Phase 1
Phase 1
An initial phrase structure is built on the basis of word-category information.
> This process is highly automatic, independent of semantic and verb argument information, and independent of task demands.
> ELAN
Language process : Phase 2
Phase 2
The relation between the verb and its arguments is computed to assign the thematic roles in the sentence.
Morphosyntactic information (subject-verb agreement, LAN), case information (LAN or N400, depending on the particular language), and lexical selectional restriction information (N400) are taken into consideration to achieve assignment of the relation between the different elements in a sentence.
Language process : Phase 3
Phase 3
The final interpretation takes place, with semantic and syntactic information being taken into account and mapped onto world knowledge.
> P600
Non verbal expression of emotion
“The voice is an auditory face”
Voice affective information
EMOTIONAL PROSODY
set of acoustic parameters of speech directly influenced by affect
Frequency info : pitch
Temporal info : speech rate or rhythm
Loudness info : intensity
→ Several acoustic cues
Multi stage model of emotional processing
1 & 2: two steps of emotional processing (2 ERP signatures)
1 - Sensory processing
Auditory processing regions: auditory cortex → STS
N100
2 - Detection of emotionally significant acoustic cues
Ventral auditory pathway : bilateral STG → anterior STS
Important because it influences the speed of subsequent processing (hence it must happen early)
P200
3 - Cognitive evaluation of emotional meaning
Semantic processing: IFG, left hemisphere
400 ms
Extremely fast interaction between processes to understand speech.
The differentiation between neutral and emotional cues
Occurs rapidly
ERP results of emotional speech
Study with sentences (Pinheiro, 2013)
Prosody effect: N100: neutral > angry // P200: neutral > happy > angry
Study with sentences (Pinheiro, 2014)
Prosody effect: N100: angry > neutral // P200: neutral > happy
ERPs can differentiate various prosody types at an early processing stage.
– The N100 and P200 components index two stages of vocal emotional processing and are sensitive to valence.
Detecting emotional salience from acoustic cues is a rapid process.
– The first differentiation between neutral and emotional prosody occurs at the level of the N100, long before the utterance is complete.
Vocal emotional processing
Highly automatic process
Emotional information cannot be ignored even when it is not task-relevant.
Can be impaired in diseases such as schizophrenia
Verbal emotional info
Emotionally relevant adjectives are processed spontaneously and selectively
– Healthy subjects may have a natural bias towards pleasant information
N400: better semantic integration of pleasant words
Mood affects language
Neutral mood
– Predictive role of sentence context (N400 to expected words ≠ N400 to unexpected words)
– Role of memory structure (WCV < BCV; within-category violations elicit a smaller N400 than between-category violations)
Positive mood:
– Facilitative effects on semantic processing (WCV were processed as if they were EW, i.e. expected words)
Negative mood:
– Narrowing effects on available semantic choices and bias towards contextual information (WCV=BCV)
ERPs can provide
Insights into the speed with which vocal emotional information is decoded
Hints at the processing stages involved
Information on the functional architecture of the auditory system as applied to vocal emotional processing