Psycholinguistics: Neurophysiology of Language Flashcards

1
Q

Components by hemispheres

A

LEFT
Phonology
Semantics
Input-output lexicons
Syntax, morphology

RIGHT
Prosody
Coarse semantics (emotion terms, not literal meaning)
Discourse context

2
Q

Process-specific neural networks

A

1 - Acoustic-phonological analysis
2 - Initial syntactic processes
3 - Computation of semantics & syntactic relations

3
Q

Different processes of language

A

Phonology (sound)
Syntax (structure)
Semantics (meaning)
Pragmatics (use of language in conversation)
Discourse
Emotional comprehension / production

4
Q

Acoustic-phonological analysis

A

N1 / N100
Primary auditory cortex - STG (superior temporal gyrus)
Shared brain resources for language and music

5
Q

Initial syntactic process

A

ELAN
(early left anterior negativity)

Inferior frontal gyrus (in particular Broca’s area, not just the frontal operculum)
Broca’s aphasia -> non-fluent speech (no verbs, no adverbs)
Articulation involves the motor cortex

6
Q

Computation of semantics and syntactic relations

A

N400
Temporal and frontal regions interact with each other

> Anterior and posterior temporal cortex, as well as the inferior frontal gyrus

7
Q

Two prosodies

Prosodic cues

A

Linguistic prosody
Pauses between words, tone in a statement vs. a question

Emotional prosody
Crucial in social interactions
Intonation

Prosodic cues
High collinearity among cues
- Pitch and pitch modulation
- Voice intensity
- Voice quality
- Speech rate and rhythm

8
Q

Speech?

A

One of the most complex tasks, yet largely automatic

9
Q

Why ERP

A

“It is hard to see the wood for the trees” (Friederici)

High temporal resolution (compared to MRI)
Knowing the precise onset of the stimulus is crucial
Many stimuli are averaged (to remove noise)
Each peak is linked to a specific process
Electrodes are placed along the nasion-inion line
Odd numbers = left hemisphere
Even numbers = right hemisphere
Problem with language production: motor artefacts
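The two practical points above (averaging many trials to remove noise, and the odd/even electrode-naming convention along the nasion-inion line) can be sketched in a few lines of Python. The waveform values are made-up toy numbers, not real EEG data:

```python
import math
import random

def average_trials(trials):
    """Time-locked average across trials: the ERP signal survives,
    while random noise shrinks by roughly 1/sqrt(number of trials)."""
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def electrode_hemisphere(label):
    """10-20 system naming: odd numbers = left scalp, even = right,
    'z' (zero) = midline along the nasion-inion axis."""
    suffix = label[-1].lower()
    if suffix == "z":
        return "midline"
    return "left" if int(suffix) % 2 == 1 else "right"

random.seed(0)
true_erp = [0, -1, -3, -5, -3, 0, 2, 4, 2, 0]  # toy N1/P2-like waveform
# 200 simulated trials: the same waveform buried in heavy random noise.
trials = [[s + random.gauss(0, 5.0) for s in true_erp] for _ in range(200)]
erp = average_trials(trials)

# Residual noise after averaging is far smaller than single-trial noise.
rms_error = math.sqrt(sum((a - b) ** 2 for a, b in zip(erp, true_erp)) / len(true_erp))
```

With a single-trial noise standard deviation of 5, averaging 200 trials leaves a residual of roughly 5/√200 ≈ 0.35, which is why many stimulus repetitions are needed.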

10
Q

ERP
component identification

A

Polarity - amplitude - scalp location - sensitivity to experimental manipulations

Amplitude: peak or mean (mean is less sensitive to latency jitter)

C1: the first visual component; it varies with the position of the stimulus in the visual field

Later time windows: peaks are more diffuse
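The peak-versus-mean distinction can be illustrated with toy numbers (a sketch, not an analysis recipe): when the same component peaks at slightly different latencies across trials, the averaged peak is smeared and underestimated, while the mean amplitude over the window is unchanged.

```python
def peak_amplitude(wave):
    """Signed value with the largest magnitude in the window."""
    return max(wave, key=abs)

def mean_amplitude(wave):
    """Average over the whole window -- robust to latency jitter."""
    return sum(wave) / len(wave)

# The same toy component, shifted by one sample between two trials
# (latency jitter), then averaged into a grand average.
trial_a = [0, 1, 3, 6, 3, 1, 0, 0]
trial_b = [0, 0, 1, 3, 6, 3, 1, 0]
grand_avg = [(a + b) / 2 for a, b in zip(trial_a, trial_b)]

# Jitter smears the averaged peak (4.5 instead of 6),
# but the mean amplitude stays at 1.75 in every case.
```

This is why mean amplitude over a time window is often preferred when single-trial latencies vary.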

11
Q

ERP
More electrodes ?

A

Be careful, because more electrodes means:
more impedance to manage (reduced with conductive gel)

Risk of cherry-picking (too much data)

12
Q

ERP effects reported for acoustic-phonological analysis

A

N1

Mismatch negativity

13
Q

Mismatch Negativity

A

Measures the discrimination of acoustic and phoneme categories (Bayesian probability)

Attention-independent - doesn’t require any task from the patient (useful with dyslexia)
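The MMN is usually quantified as a deviant-minus-standard difference wave in an oddball paradigm. A minimal sketch with made-up averaged responses (the values are illustrative, not real data):

```python
def difference_wave(deviant_erp, standard_erp):
    """MMN = response to rare deviants minus response to frequent
    standards; a negative deflection around 150-250 ms indicates
    that the two sound categories were discriminated."""
    return [d - s for d, s in zip(deviant_erp, standard_erp)]

# Toy averaged responses (microvolts) at successive time samples.
standard = [0.0, -1.0, -1.5, -1.0, 0.0]
deviant  = [0.0, -1.5, -3.5, -2.5, 0.0]

mmn = difference_wave(deviant, standard)
# The negative dip in the difference wave is the MMN.
```

Because this difference emerges without any task, it can be measured passively, which is what makes it useful with patients and with dyslexia.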

14
Q

N1

A

75-150 ms

Component generated in primary and secondary auditory cortex (can be modulated by frontal areas)

Mostly exogenous, but with top-down influence

Functional significance:
correlates with the identification of phonemes

15
Q

MMN

A

150-250 ms

Mismatch negativity

auditory cortex

Functional significance
It reflects the discrimination of acoustic and phoneme categories (Pulvermüller)

16
Q

N400

A

Related to semantic congruency

Processing of semantics at the word and sentence level

Reflects the difficulty of lexical-semantic integration

Increases with:
non-words (ertertrrss) / pseudowords (murle)
unrelated pairs: “Nurse / Hospital” < “Nurse / Apple”

Decreases as the sentence unfolds, due to the increased predictability of the upcoming word
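This predictability effect is often modeled with word surprisal, −log P(word | context) (a common modeling assumption, not something stated in the card): as the sentence unfolds, the upcoming word becomes more predictable, its surprisal falls, and so does the N400-like cost.

```python
import math

def surprisal(probability):
    """Surprisal in bits: -log2 P(word | context)."""
    return -math.log2(probability)

# Hypothetical cloze probabilities for the same word, with little
# vs. a lot of preceding sentence context (made-up values).
p_early, p_late = 0.05, 0.60

# More context -> higher predictability -> lower surprisal,
# mirroring the smaller N400 late in a sentence.
```

For example, "hospital" after just "The..." is hard to predict (high surprisal), but after "The nurse worked all night at the..." it is highly expected (low surprisal).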

17
Q

P600

A

Processes of syntactic reanalysis and repair

Observed in situations of:
processing of syntactic anomalies
temporarily ambiguous sentences

  • “The judge believed the defendant was lying.”
  • “The defendant” is first analyzed as an object, but “was” forces reanalysis as a subject
18
Q

A model of language processing

A

The neurotemporal dynamics of language comprehension:
3 sequential phases

Phase 1
An initial phrase structure is built on the basis of word-category information.

Phase 2
The relation between the verb and its arguments is computed to assign the thematic roles in the sentence.

Phase 3
The final interpretation takes place

19
Q

Language process : Phase 1

A

Phase 1
An initial phrase structure is built on the basis of word-category information.

> This process is highly automatic, independent of semantic and verb argument information, and independent of task demands.

> N100, MMN

20
Q

Language process : Phase 2

A

Phase 2
The relation between the verb and its arguments is computed to assign the thematic roles in the sentence.

Morphosyntactic information (subject-verb agreement, LAN), case information (LAN or N400, depending on the particular language), and lexical selectional restriction information (N400) are taken into consideration to achieve assignment of the relation between the different elements in a sentence.

21
Q

Language process : Phase 3

A

Phase 3

The final interpretation takes place, with semantic and syntactic information being taken into account and mapped onto world knowledge.

> P600

22
Q

Non verbal expression of emotion

A

“The voice is an auditory face”

23
Q

Voice affective information

A

EMOTIONAL PROSODY
set of acoustic parameters of speech directly influenced by affect

Frequency info: pitch
Temporal info: speech rate or rhythm
Loudness info: intensity

Several acoustic cues

24
Q

Multi stage model of emotional processing

A

Stages 1 & 2: two early steps of emotional processing (2 ERP signatures)

1 - Sensory processing
Auditory processing regions: auditory cortex → STS
N100

2 - Detection of emotionally significant acoustic cues
Ventral auditory pathway: bilateral STG → anterior STS
Important because it influences the speed of other processes (important that it happens early)
P200

3 - Cognitive evaluation of emotional meaning
Semantic processing: IFG - left hemisphere
~400 ms

Extremely fast interaction between processes to understand speech.

25
Q

The differentiation between neutral and emotional cues

A

Occurs rapidly

26
Q

ERP results of emotional speech

A

Study with sentences (Pinheiro, 2013)
Prosody effect: N100: neutral > angry // P200: neutral > happy > angry

Study with sentences (Pinheiro, 2014)
Prosody effect: N100: angry > neutral // P200: neutral > happy

ERPs can differentiate various prosody types at an early processing stage.
– The N100 and P200 components index two stages of vocal emotional processing and are sensitive to valence.

Detecting emotional salience from acoustic cues is a rapid process.
– The first differentiation between neutral and emotional prosody occurs at the level of the N100, long before the utterance is complete.

27
Q

Vocal emotional processing

A

Highly automatic process

Emotional information cannot be ignored even when it is not task-relevant.

Can be impaired in diseases like Schizophrenia

28
Q

Verbal emotional info

A

Emotionally relevant adjectives are processed spontaneously and selectively

– Healthy subjects may have a natural bias towards pleasant information

N400: better semantic integration of pleasant words

29
Q

Mood affects language

A

Neutral mood:
– Predictive role of sentence context (N400 to expected words ≠ N400 to unexpected words)
– Role of memory structure (WCV < BCV)

Positive mood:
– Facilitative effects on semantic processing (WCV were processed as if they were EW)

Negative mood:
– Narrowing of available semantic choices and a bias towards contextual information (WCV = BCV)

(EW = expected words; WCV/BCV = within-/between-category violations)

30
Q

ERPs can provide

A

Insights into the speed with which vocal emotional information is decoded

Hints at the processing stages involved

Information on the functional architecture of the auditory system as applied to vocal emotional processing