Grammar Learning A Flashcards
Act-out paradigm
Demonstrate/explain a novel verb and then ask the kid to act it out
Kids act out the action correctly when an animal does it, but get it backwards when
the book, rather than the sheep, should be doing the action
Around age 2.5, kids were just above chance
Much better at age 3.5 - English-speaking kids were almost perfect
By age 4.5, English-speaking kids generally got it right
Using the word order of English even though it goes against real world
knowledge
English very rarely uses the passive voice in spoken language
Transitive construction
A type of sentence where a verb is related to at least two nouns
Word order
A doer and a done-to (The dog chased the cat)
Kids do make these errors - over-generalization
Intransitive construction
A type of sentence in which a verb is related to only one noun
The noun is characteristically an agent
In English the noun comes before the verb
The man thinks.
Ditransitive construction
A sentence in which a verb is related to at least three nouns
Giver, recipient, given (The man gave the dog a bone)
Word order
Verb Island Hypothesis
Akhtar and Tomasello
Children’s early language is organized and structured around individual verbs
and other predicative terms
No evidence of abstract grammatical categories
Every verb is an island
20 children, 2.75 and 3.75 years
Children are shown many models of a novel transitive action and told
This is dacking.
Given toys and asked
Make Cookie Monster dack Big Bird.
Both ages could act out familiar verbs
3.75 year olds are really good with the unfamiliar verbs
Kids’ knowledge of word order at 2.75 years is still specific to particular
verbs and not generally applied
Preferential looking task for verb island hypothesis
Gertner et al.
The duck is gorping the bunny. Find gorping.
Show two pictures.
At what age are kids good at this task?
21 month olds were at 70%
“Diary” studies
Researcher is the caregiver
Write down interesting new things children do in a diary
Limitations
‣ Report momentous accomplishments more than mistakes
‣ Who’s to say what a momentous event is
‣ Smaller things are not noticed, or are not noticed as important for
months
‣ Only one or two children per study
‣ Only children of a certain socioeconomic class/children of researchers
are studied
‣ Bias - the researcher is coming with specific expectations
(People perceive bias even if it isn’t there)
‣ Hard to replicate the circumstances under which a child learned
something
Advantages
‣ Longevity
‣ Outside of artificial lab setting
‣ Caregiver might over-interpret, but they will understand their kids better than a transcriptionist
Naturalistic data analysis
Large collections of transcriptions of kids interacting with the primary
caregiver
Recent data sets include recordings of children for an hour or more every day
from before their second birthday until age four
Video and (transcribed) audio
Limitations
‣ A lot of what kids do does not involve language - hard to get language
development information - need lots of recordings
‣ If you have an hour of dense language, the kids might have been
manipulated by the parents to say things (hard to be natural) - lots of
questions
‣ Sometimes hard to understand the kid (for the transcriptionist) - might
be misinterpreted
‣ The kid copies what the dad says - he's asking leading questions
Advantages
‣ Objective recording
‣ Not selectively reported
‣ Can have data collected from non-experts’ (not researchers’) children
Pivot grammar
Braine
Two word stage
Pivots and objects
P1 + O - more juice, more milk
O + P2 - juice gone, mommy gone
O + O - ball table, mommy sock
Basic structures into which they can insert elements - some rules about what can
be put where, and which pivots come before and which come after
Built around specific words
Minimal/conservative generalization
Problems
‣ Doesn’t explain why
‣ Hard to falsify - vague; it happens to fit, but we can't tell whether kids are
doing something more complicated
Semantic Relations
Roger Brown
Two word stage
Less conservative generalization
Kids’ knowledge is not built around certain words, but is not completely
general either
Based upon semantic relations, narrow categories
‣ Agent + action (Daddy sit)
‣ Agent + object (Mommy sock)
How far do they generalize? Mommy, Daddy, cat are not all the same, even
though they are all agents
Agent vs entity (Dad floor, ball floor)?
Artificial Grammar
Gomez and Gerken
Create a completely artificial grammar, expose kids to it, and see what they've
learned (see the sketch at the end of this entry)
Play strings of sounds generated by the artificial grammar
1-2 minute exposure to one of two grammars (made of the same words)
12-month-olds learn these grammars after very brief exposure
Test children's ability to distinguish previously unheard sequences of sounds
generated by that grammar from sequences not generated by that grammar
Head turn preference
Listen longer to new strings generated by the grammar they were just
exposed to than to strings generated by the other grammar
Prefer to listen to new words in their language rather than new words
from a language they’ve never heard before
What does this tell us about language learning?
Children can learn grammars using statistical probabilities
But…
Real language is complicated
Real language is productive
We learn categories, not all possible word combinations
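A minimal sketch, in Python, of what an artificial finite-state grammar of this kind could look like. The nonsense words and transitions below are invented for illustration; they are not Gomez and Gerken's actual stimuli.

    import random

    # Hypothetical finite-state grammar: each state maps to (word, next state)
    # options.  Words and transitions are made up, not the real materials.
    GRAMMAR_A = {
        "START": [("bli", "S1"), ("kor", "S2")],
        "S1":    [("kor", "S2"), ("dap", "END")],
        "S2":    [("mig", "S1"), ("dap", "END")],
    }

    def generate(grammar, max_len=6):
        """Generate one string of nonsense words by walking the grammar."""
        state, words = "START", []
        while state != "END" and len(words) < max_len:
            word, state = random.choice(grammar[state])
            words.append(word)
        return " ".join(words)

    # Familiarization: the 1-2 minute exposure is a stream of such strings
    training = [generate(GRAMMAR_A) for _ in range(20)]
    print(training[:3])

    # Test items would contrast new strings from GRAMMAR_A with strings from a
    # second grammar built from the same words but different transitions.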
Distributional Model of Grammar Experiment
Gomez and Lakusta
A words paired with X (two-syllable) words, B words paired with Y (one-syllable)
words (see the sketch at the end of this entry)
All words are made up
One of two languages, the above rules or the reverse
Exposed kids to sequences of sounds that followed the rules
12 month olds
Do kids notice/learn this pattern?
After being trained, they were given unheard strings that followed the rule,
as well as strings that don’t follow the rules
In head turn test, the kids listened longer to strings that followed the rules
they had just learned - new combinations with X and Y words that they
never heard during training
Distributional model of grammar - kids know the grammar of a language by
learning that words that occur in similar contexts are similar
This study suggests that kids are doing this - recognizing that there are
rules
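A minimal sketch, in Python, of the A-X / B-Y co-occurrence rule being tested. The word classes below are hypothetical placeholders, not the actual Gomez and Lakusta stimuli.

    import random

    # Hypothetical word classes (placeholders, not the real stimuli)
    A_WORDS = ["ko", "mer"]          # "A" marker words
    B_WORDS = ["fip", "zan"]         # "B" marker words
    X_WORDS = ["balip", "tumo"]      # two-syllable words
    Y_WORDS = ["dek", "seft"]        # one-syllable words

    def make_phrase(language):
        """Language 1 pairs A with X and B with Y; Language 2 is the reverse."""
        if language == 1:
            pairings = [(A_WORDS, X_WORDS), (B_WORDS, Y_WORDS)]
        else:
            pairings = [(A_WORDS, Y_WORDS), (B_WORDS, X_WORDS)]
        markers, contents = random.choice(pairings)
        return f"{random.choice(markers)} {random.choice(contents)}"

    # Training: a 12-month-old hears many phrases from one language
    training = [make_phrase(1) for _ in range(30)]

    # Test: new, unheard combinations that either follow or violate the rule
    print(make_phrase(1))   # consistent with training
    print(make_phrase(2))   # inconsistent with training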
Will kids learn that there is a correspondence between distributions and
meaning?
Lany & Saffran
Train kids on the same language as in the Gomez/Lakusta experiment
Teach the kids words - each category mapped onto one type of referent
Two-syllable words were animals, and one-syllable words were vehicles
At test, show them two pictures and play a sequence of words
Where does the kid look?
Found that kids would start to look at the picture from the category that was
shown in training
Still an artificial relationship, but different forms of words do relate to
grammatical categories in English
Representing Meaning through Collection of Words
Bag of words approach
‣ Scrambled words from various documents
‣ What parts of the meaning of documents can you capture through an
unordered collection of words?
- What the document is about
- Things mentioned in the document
• Documents on similar topics contain similar words
Don't need to know anything about natural language, just use word counts (see
the sketch below)
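A minimal sketch of a bag-of-words count in Python; the two sample documents are made up.

    from collections import Counter

    # Two made-up documents on similar topics
    docs = {
        "review1": "the film was great and the cast of the film was great",
        "review2": "a great cast and a great plot made the movie fun",
    }

    # Bag of words: throw away word order, keep only a count per word
    bags = {name: Counter(text.lower().split()) for name, text in docs.items()}

    print(bags["review1"]["film"])    # how often "film" occurs in review1 -> 2
    print(bags["review1"]["great"])   # -> 2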
Vector Space/Semantic Space/Distributional Model
Table to vector
Each entry is a dimension
The word “film” is a dimension; if its count is 24, the document's coordinate on
that dimension is 24
The number of dimensions is the number of words that are being counted
Proximity between two documents in the space corresponds to their similarity
Can be done completely automatically
- Euclidean distance - straight-line distance from point A to point B
- Cosine similarity - the angle between the two vectors
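A minimal sketch of both measures over word-count vectors in Python; the vocabulary and counts are made up.

    import math

    # Each document is a vector of word counts; here the dimensions are a
    # made-up vocabulary ("film", "cast", "plot")
    doc_a = [24, 3, 1]
    doc_b = [20, 5, 2]

    def euclidean(u, v):
        """Straight-line distance between the two points."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    def cosine(u, v):
        """Cosine of the angle between the two vectors (1.0 = same direction)."""
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(b * b for b in v))
        return dot / (norm_u * norm_v)

    print(euclidean(doc_a, doc_b))   # smaller distance = more similar
    print(cosine(doc_a, doc_b))      # closer to 1.0 = more similar

Euclidean distance is sensitive to document length (longer documents have larger counts overall), which is one reason cosine similarity is often preferred.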