Lecture 10: The Development of Language Comprehension (Flashcards)
Which comes first: language comprehension or production?
Language Comprehension precedes Language Production
Language development involves the mastery of three components:
Language generativity
Semantic development
Pragmatic development
Language Generativity
Phoneme: The basic unit of sound used to produce language.
Phonological development: The acquisition of knowledge about the sound systems of one’s own language.
Morpheme: The smallest unit of meaningful sound, usually one or two phonemes.
Semantic Development
Learning to express meaning in language; includes word learning.
Syntax: The rules for the ways in which words can be combined to make sense.
Syntactic development: Learning the rules for combining words in a given language
Pragmatic Development
- Acquiring knowledge about how language is used, such as the rules for conversation.
- Adults have metalinguistic knowledge (knowledge about the properties of language and language use) that children do not yet have.
Language and the Brain
Language is a species-specific behavior.
Only humans acquire language in the normal course of development, although some primates have been taught to sign and recognize words.
Language seems to be localized in the brain.
For 90% of right-handed people, language is primarily controlled in the left hemisphere of the cerebral cortex.
Is Language Acquisition “Special”?
Until the 1940s and 1950s, the general view was that learning language was like any other form of learning.
Behaviourist traditions: learners form associations between sounds and meanings.
But various findings (mainly failures of this approach) led to the realisation that something different was going on with language.
Critical Period for Language Development
There seems to be a critical period during which language develops readily; after this period (ending sometime between age 5 and puberty), language acquisition is harder and ultimately less successful.
Evidence:
1. The effect of language deprivation during the critical period.
2. The effects of damage to language areas in the brain (children recover more readily than adults).
3. The ages at which a second language is acquired.
Language Deprivation
Chelsea: First language experience at 31 years
Isabelle: First language experience at 6 years
Genie: First language experience at 13.5 years
Victor (the “wild boy of Aveyron”): First language experience at 12 years
Brain Damage
Differences in recovery rate and extent of recovery in patients with aphasia:
children recover faster than adults
children are more likely than adults to show full recovery
Differences in progress in language acquisition before vs. after puberty in people with intellectual disabilities.
Problem: in these cases we are not dealing with typically developing brains.
Second Language Acquisition
Johnson and Newport (1989): Korean and Chinese people who had come to the US at different ages:
early arrivals (before age 15)
late arrivals (after age 17)
Matched in “experience”
Judged whether spoken sentences were grammatical or not
Participants who began learning English earlier did better:
those arriving before age 7 performed like native speakers
performance declined for arrivals from age 8 onward, even earlier than puberty, but the decline was gradual
Language and the Human Environment
Having a human brain is not sufficient for language to develop.
As already discussed, language development requires exposure to other people and opportunities to use language with them.
Caregivers and siblings begin to communicate through language with infants almost from birth.
Speech Perception
Acquiring language involves listening, talking, and understanding what others are communicating. Two key aspects of speech perception:
Categorical Perception
Prosody
Speech Perception – Categorical Perception
- Infants and adults can perceive speech sounds as belonging to discrete categories.
- Phonemic contrast ability appears to be innate, present at birth and independent of experience.
- Infants can make sharp distinctions between speech sounds.
- Infants can distinguish new sounds from ones they already know.
How can we examine what babies hear?
Eimas et al. (see Eimas, 1985) used a clever non-nutritive sucking methodology.
Babies suck on a nipple that is connected to a tape recorder.
Every time they suck, they hear sounds from the tape recorder.
Babies soon learn to suck at high amplitudes to hear the tape recorder.
Changes in sucking rate tell us what contrasts babies are sensitive to.
If the same sound is repeated, babies gradually slow down their sucking (habituation).
When they perceive a change, they quickly speed up again (dishabituation).
What do babies hear? Method
Eimas et al. used synthetic syllables of the “pa”/“ba” sort, with controlled voice onset times (VOTs).
VOT for “ba” is approx. 15 ms
VOT for “pa” is approx. 100 ms
The baby would habituate to a syllable with a particular VOT; the sound would then be changed to a different VOT.
Question: What changes would the baby notice?
What do babies hear? Results
One-month-old babies noticed all and only the changes an adult listener would notice:
e.g., sucking rate did not change for changes in VOT from 0-20 ms or from 40-60 ms,
but DID change for a shift from 20 ms to 40 ms, corresponding to the “pa”/“ba” boundary for adults (see the sketch below).
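A minimal sketch of this logic in Python. The 30 ms boundary and the function names are illustrative assumptions; the lecture only reports that listeners hear a change across the 20-40 ms range but not within the 0-20 ms or 40-60 ms ranges.

```python
# Hypothetical /ba/-/pa/ category boundary at 30 ms VOT (an assumption;
# the lecture only locates the boundary somewhere between 20 and 40 ms).
def perceived_category(vot_ms, boundary_ms=30.0):
    """Map a voice onset time (ms) to the perceived phoneme category."""
    return "ba" if vot_ms < boundary_ms else "pa"

def predicts_dishabituation(old_vot_ms, new_vot_ms):
    """Renewed sucking is predicted only when the VOT change crosses the
    category boundary, not for equal-sized within-category changes."""
    return perceived_category(old_vot_ms) != perceived_category(new_vot_ms)

print(predicts_dishabituation(0, 20))   # False: both heard as /ba/
print(predicts_dishabituation(40, 60))  # False: both heard as /pa/
print(predicts_dishabituation(20, 40))  # True: crosses the /ba/-/pa/ boundary
```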
Phoneme Discrimination
Kuhl et al. (1992): babies turned their heads to sounds that changed by a phoneme (e.g., “pop” -> “peep”), but showed no head turn to sounds that differed only in voice, length, or intonation.
A Biological Endowment?
Categorical phoneme perception ability in infants suggests a hard-wired module.
Why else would a child respond categorically to gradations of some acoustic properties but not others?
Prediction of Universality
If the phonetic module is a biological endowment, it should be universal.
Would not expect differences in perception across infants of different linguistic backgrounds.
This turns out to be the case:
Babies as Universal Phoneticians
Infants outperform adults in categorical perception in an important respect:
Infants are sensitive to any phonetic discrimination in any language.
Adults are only sensitive to the relevant phonetic discriminations of their own language.
example of universal phoneticians
Japanese babies can hear the distinction between /r/ and /l/, even though Japanese adults cannot (Eimas, 1985).
English-learning babies can hear contrasts relevant in languages like Czech and Hindi that English-speaking adults cannot (Werker & Tees, 1984).
In the case of speech, older does not mean wiser…
When Does This Change?
Between about 8 and 12 months, babies stop discriminating sounds not used in their own language (Werker, 1989).
Could be due to:
exposure to the native language
starting to talk: as babies attend selectively to sounds that affect meaning in their own language, they stop noticing sounds that are not used
Speech Perception – Prosody
The characteristic rhythm, tempo, cadence, melody, intonation pattern, etc., with which language is spoken.
Infants are sensitive to the prosody of the languages they hear.
Baby’s Got Rhythm…(Nazzi et al., 1998)
Nazzi et al. (1998): five-day-old French babies can:
distinguish English (stress-timed) from Japanese (mora-timed), but NOT two stress-timed languages (English and Dutch)
distinguish English from Italian and Spanish, but NOT Italian from Spanish (which have the same timing)
Mehler et al. (1988): four-day-old French babies suck harder to hear French than Russian.
They seem to be tuned to the rhythm:
they still suck harder for French when the recordings are filtered so that vowel and consonant detail is removed but the rhythm is preserved (see the sketch below)
they are indifferent when the tapes are played backwards, so that the melody is distorted
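A minimal sketch of how such “rhythm-only” stimuli can be made. The lecture does not say how the filtering was done; low-pass filtering is one common approach, and the 400 Hz cutoff, function name, and synthetic test signal below are illustrative assumptions.

```python
# Minimal sketch: low-pass filter speech so fine spectral detail (vowel and
# consonant identity) is removed while slow prosodic variation is kept.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def preserve_rhythm(speech, sample_rate, cutoff_hz=400.0):
    """Return a low-pass filtered copy of a speech waveform (the cutoff is an
    illustrative value, not one taken from the studies cited above)."""
    sos = butter(4, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    return sosfiltfilt(sos, speech)

# Usage with a synthetic signal standing in for a recorded passage.
fs = 16_000
t = np.arange(fs) / fs                      # one second of audio
speech = np.sin(2 * np.pi * 150 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
rhythm_only = preserve_rhythm(speech, fs)
```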
Other Evidence: General Sensitivity to Voices
Newborns can discriminate most pure tones, but best discriminations for human speech frequencies (Eisenberg, 1976).
Babies imitate sounds of human speakers, but don’t imitate other sounds (e.g. fridge).
From two weeks of age, babies will stop crying if someone speaks to them, but not if a bell is rung.
Particular Sensitivity to Mother’s Voice
Children as young as 3 days old recognise their mother’s voice and discriminate it from the voices of other mothers.
Babies may prefer the acoustic characteristics of a speech passage their mother recited while pregnant over a passage she did not read (DeCasper & Spence, 1986).
Two-way Street?
Parents also may be sensitive to infants’ perceptual needs.
They use characteristic intonation patterns when speaking to infants: infant-directed speech (IDS).
slow rate
exaggerated intonation and facial expressions
Intonation reflects meaning (e.g., approval/disapproval)
high fundamental frequency
May assist segregation of speech units
Is IDS a Speech Perception Aid?
Parents may (unknowingly) use IDS to:
assist in the delineation of speech sounds
be cute, warm, fuzzy, etc.
How might we distinguish between these possibilities?
Burnham (1998) compared IDS with PDS (pet-directed speech!).
IDS had acoustic characteristics different from PDS.
Developmental Changes in Speech Perception
- Young children are better than adults in distinguishing phonemic contrasts that are not made in their own language.
- By the age of 1 year, children’s speech perception seems to have become specialized for their own language – more like adults.
- Infants are sensitive to the distributional properties of the speech they hear (e.g., which sounds tend to occur together), and they are more sensitive to words than to nonword sounds.