Language & the Brain Flashcards
Neural Structural Anatomy in deafness
Secondary auditory cortex & associated areas are used to process visual information
These regions do not die but are instead recruited for input from other modalities
Types of HI (hearing impairment)
Conductive: problem in the outer or middle ear (sound not conducted to the inner ear)
e.g. glue ear
Sensorineural: damage to the cochlea or auditory nerve (inner ear)
Levels of HI
Mild = 24-40 dB; Moderate = 41-70 dB; Severe = 71-95 dB; Profound = 95+ dB
=> Above ~50 dB, conversational speech is not heard (see the sketch below)
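A minimal Python sketch of the banding above, purely as a memory aid; the function name, the handling of thresholds below the mild band, and the ~50-60 dB figure for conversational speech are illustrative assumptions rather than part of the flashcards.

```python
def classify_hearing_loss(threshold_db: float) -> str:
    """Map an average hearing threshold (dB HL) to a severity band.

    Band boundaries follow the levels listed above; values below
    the mild band are treated here as within normal limits.
    """
    if threshold_db >= 95:
        return "Profound"
    if threshold_db >= 71:
        return "Severe"
    if threshold_db >= 41:
        return "Moderate"
    if threshold_db >= 24:
        return "Mild"
    return "Within normal limits"


# Conversational speech sits at roughly 50-60 dB, so thresholds above
# ~50 dB (moderate or worse) mean speech is not heard unaided.
print(classify_hearing_loss(55))  # -> Moderate
```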
Sign Language consists of:
Discourse, Semantics, Syntax, Morphology, Phonology
=> Same linguistic level as speech
Sign language neurology:
Comparing Reading in Hearing non-signers vs. Deaf Signers
Hearing:
Left lateralisation
Deaf:
Right side involvement
More involvement of the temporal lobe & IFG (inferior frontal gyrus)
Sign language neurology:
Audiovisual-speech in deaf native signers vs. hearing non-signers
MacSweeney et al., Brain, 2002
BSL in deaf signers
vs. spoken English in hearing non-signers
Both groups activated:
> Inferior prefrontal regions bilaterally (including Broca's area)
> Superior temporal regions bilaterally (including Wernicke's area)
> Similar lateralisation in both groups
Hearing non-signers:
-> Greater activation in the primary and secondary auditory cortices
-> Left temporal auditory regions may be privileged for processing heard speech
-> In the absence of auditory input, this region can be recruited for visual processing
BSL in deaf signers:
-> Enhanced activation in the posterior occipito-temporal regions (V5)
-> Occipito-temporal activation likely reflects the greater movement component of BSL
Core LH language network
> Inferior prefrontal regions bilaterally (including Broca’s area)
> Superior temporal regions bilaterally (including Wernicke’s area).
-> Recruited regardless of modality (speech or sign)
RH and Language
> Supports narrative-level discourse
–>Prosody, affect, facial expression, and discourse structure.
> Deaf:
-> RH activation in the inferior frontal gyrus and superior temporal sulcus was greater for sentences containing narrative devices
Parietal Lobe & sign
Special role for spatial processing
> L. Inferior Parietal Lobule (IPL) = phonological production errors
> L. Superior Parietal Lobule (SPL) activation = memory for signs rather than words
> L. IPL & SPL = greater activation for sign than for speech production
- Greater activation in SPL for spatial BSL sentences
- Greater activation in SPL for sign than for speech
Reading & Lip-reading
Lip-reading plays a role in reading development (?)
-> Mediated by phonological awareness (?)
-> Lip-reading gives deaf readers a phonological representation of the spoken word
Information about phonology in deafness comes from:
Articulation
Orthography (writing)
Residual auditory input
Reading & Neurology
Same regions active for deaf BSL users & hearing readers
> Left-lateralised fronto-parietal network
Therefore the phonological network operates regardless of modality (speech or BSL), i.e. it is supramodal
Activation was influenced by:
- Language – BSL/English
- Hearing status
- Age of BSL acquisition
- Non-native signers activated L. IFG more than native signers
RCT: Lipreading training & Reading development
Computer program: ~10 minutes per day, 4 days per week
Speech reading: gains at T3 follow-up
Phonological awareness: gains at T2 and larger gains at T3
Phonological representations: Gains at T3
-> More time may be needed for downstream effects on reading?