Sentence Processing Pt1 Flashcards
Sentential Structure
In English: (S)ubject (V)erb (O)bject
- eg “the dogs chased the cats”
- NP (subj) “the dogs” | V “chased” | NP (obj) “the cats”
Parsing
Dividing a sentence into its grammatical parts and analyzing their syntactic roles
- collection of structure-building mechanisms and procedures
[S [NP The dogs] [VP [V chased] [NP the cats]]]
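A minimal code sketch (illustrative only) of how the parse above could be represented as a tree data structure; the Node class and its labels are invented for this example, not a standard parser API.

```python
# Toy constituency tree for "the dogs chased the cats".
# The Node class and labels are illustrative only.

class Node:
    def __init__(self, label, children=None, word=None):
        self.label = label            # e.g. "S", "NP", "VP", "V"
        self.children = children or []
        self.word = word              # set only on terminal nodes

    def bracketed(self):
        """Labelled-bracket form, e.g. [S [NP the dogs] ...]."""
        if self.word is not None:
            return f"[{self.label} {self.word}]"
        inner = " ".join(child.bracketed() for child in self.children)
        return f"[{self.label} {inner}]"

tree = Node("S", [
    Node("NP", word="the dogs"),
    Node("VP", [Node("V", word="chased"), Node("NP", word="the cats")]),
])

print(tree.bracketed())
# -> [S [NP the dogs] [VP [V chased] [NP the cats]]]
```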
Incremental Parsing
Incrementality: the parser builds sentential structure and meaning as the words of the sentence unfold
Outcomes of incremental parsing are NOT always correct
- eg activation of onset competitors
- ham…
- hammer, hammock, hamstrings …
The system will have to suppress the incorrect interpretations
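A toy sketch of onset-competitor activation: as more of the word is heard, only candidates consistent with the input so far stay active. Spelling stands in for phonemes, and the mini-lexicon is invented for illustration.

```python
# Toy illustration of onset competitors: as more of "hammer" is heard, only
# lexical candidates consistent with the input so far remain active.
# Spelling stands in for phonemes; the mini-lexicon is invented.

lexicon = ["hammer", "hammock", "hamstrings", "hamster", "happy", "beaker"]

def active_candidates(heard_so_far, lexicon):
    """Words whose onset matches everything heard so far."""
    return [word for word in lexicon if word.startswith(heard_so_far)]

for heard in ["ha", "ham", "hamm", "hammer"]:
    print(heard, "->", active_candidates(heard, lexicon))
# "ham" activates hammer, hammock, hamstrings, hamster (onset competitors);
# later input suppresses all but "hammer".
```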
Temporary Ambiguity: Written
Temporary ambiguity causes a brief misinterpretation of a sentence’s syntax
- eg “while Anna dressed the baby was in the crib”
- “while Anna dressed” = subj verb
- “while Anna dressed the baby” = subj verb obj
- “[while Anna dressed] [the baby was in the crib]” = [subj verb] [subj verb obj]
Relative Clause
A clause that modifies a noun
- example:
Ambiguity: “the general presented copies of the report was aware of the issue”
Relative clause: “the general who presented copies of the report was aware of the issue”
- who presented… modifies “the general”
Reduced Relative Clause
A structure involving a relative clause where certain function words are omitted, thereby resulting in ambiguity
- “the general presented copies of the report was aware of the issue”
- missing function word “who”
Garden Path Sentences
Sentences whose temporary ambiguity leads the parser toward an initially incorrect analysis as they are incrementally processed
Disambiguating Region: the first point at which only the correct interpretation is consistent with the unfolding sentence
- eg “while Anna dressed the baby was in the crib”
- processing difficulty (longer reaction time) often observed at the disambiguating region
Parsing challenges: ambiguity
1) Temporary Ambiguity
2) Global Ambiguity
Temporary Ambiguity: Spoken
Temporary ambiguity causes a brief misinterpretation of a sentence’s syntax
- eg. “Pick up the /bi/…”
- /bi/ “beaker”
- /bi/ “beetle”
2) Global Ambiguity
Sentences that can be interpreted in more than one way (have more than one meaning) depending on how the words are grouped
Attachment Preferences
- eg: “the hooligan damaged the new shop with the fireworks”
- “with the fireworks” attached to:
- noun: “shop with the fireworks”
- verb: “damaged … with the fireworks”
Examining processing difficulties
Self-Paced Reading Task:
A behavioural task intended to measure processing difficulty at various points in the sentence
- participants read through sentences one word or phrase at a time by pressing a key to advance through the sentence
Self-Paced Reading Task EXAMPLE
(1)
(a) the hooligan (b) damaged (c) the new shop (d) with the (E) fireworks
(2)
(a) the hooligan (b) damaged (c) the new shop (d) with the (F) discounts
- analysis: sentences are divided into regions; reaction times are measured for each region
- region of interest: (E)/(F)
- IF reading times are longer for (F) “discounts” in (2) than for (E) “fireworks” in (1), we can conclude that people have more difficulty parsing the structure in (2) than in (1)
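A sketch of how region-by-region reading times could be compared across the two sentences; the millisecond values below are placeholders for illustration, not real data.

```python
# Sketch of comparing reading times at the region of interest in a
# self-paced reading task. Values are placeholders, not real data.

from statistics import mean

rt_region_E_fireworks = [412, 398, 430, 405]   # sentence (1), region (E); placeholder values
rt_region_F_discounts = [520, 498, 545, 510]   # sentence (2), region (F); placeholder values

mean_E = mean(rt_region_E_fireworks)
mean_F = mean(rt_region_F_discounts)

print(f"mean RT at (E) 'fireworks': {mean_E:.0f} ms")
print(f"mean RT at (F) 'discounts': {mean_F:.0f} ms")

# If RTs at (F) are reliably longer than at (E) (checked with an appropriate
# statistical test), that region is taken to index greater parsing difficulty.
if mean_F > mean_E:
    print("Longer RTs at (F): more difficulty parsing sentence (2) at that region.")
```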
Models of Ambiguity Resolution
1) Garden Path Theory
2) Constraint-Based Model
1) Garden Path Theory
Frazier & Fodor (1978)
Two stages:
(a) initial analysis
- identify parts of speech
- parse hierarchical sentential structure
(b) reanalysis
- influence of other info (plausibility, context …)
Garden Path theorists suggest that when we parse linguistic information, we initially build the simplest analysis—one structure, one meaning only
- in English, SVO
- all other info sources later, prompting reanalysis
- avoids syntactic complexity
The parser conserves cognitive resources by using simple structure-building heuristics
Garden Path Theory EXAMPLE
Ex: “While Anna dressed the baby was in the crib”
Initial Structural Analysis:
- simplest analysis: SVO
- analyze “the baby” as the direct object
Reanalysis:
- “was” = disambiguating region—renders the simple analysis implausible
- use context and meaning at a later time to revise analysis
- “the baby” ≠ direct object
- “the baby” = subject of the next clause
Garden Path Theory: Minimal Attachment
When more than one structure is licensed and consistent with the input, build the structure with the fewest nodes
Pre-X-bar structures:
“While Anna dressed the baby”
Preferred (fewest nodes), with “the baby” attached as the object of “dressed”:
[S [Adv While] [S [NP Anna] [VP [V dressed] [NP the baby]]]]
NOT the more complex two-clause structure, with “the baby” as the subject of a new clause:
[S [S [Adv While] [S [NP Anna] [VP [V dressed]]]] [S [NP the baby] [VP [V was]]]]
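A toy sketch of the Minimal Attachment preference: write each candidate structure in labelled-bracket form and prefer the one with fewer nodes. Counting opening brackets stands in for counting nodes; this is an illustration, not the theory's actual mechanism.

```python
# Toy sketch of Minimal Attachment: given candidate structures for the same
# fragment (in labelled-bracket form), prefer the one with fewer nodes.
# Counting "[" stands in for counting labelled nodes.

candidates = {
    "'the baby' = object of 'dressed'":
        "[S [Adv While] [S [NP Anna] [VP [V dressed] [NP the baby]]]]",
    "'the baby' = subject of a new clause":
        "[S [S [Adv While] [S [NP Anna] [VP [V dressed]]]] [S [NP the baby] [VP [V was]]]]",
}

def node_count(bracketed):
    return bracketed.count("[")

for name, structure in candidates.items():
    print(f"{name}: {node_count(structure)} nodes")

preferred = min(candidates, key=lambda name: node_count(candidates[name]))
print("Minimal Attachment prefers:", preferred)
# -> the object analysis, which is why "the baby" is first parsed as the
#    direct object of "dressed"
```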
2) Constraint-Based Model
MacDonald and colleagues (1994)
Constraint-based theorists propose that, as the sentence unfolds, all possible structures are activated at once (in parallel)
- different info sources (constraints) affect the parser’s decision, such as:
- semantic info
- contextual info
- frequency characteristics
- when the constraints strongly favour one interpretation over another, the parser settles on it early
- as a result, not all temporarily ambiguous structures lead to a garden path effect
Constraint-Based Model EXAMPLE
Ex: “the dog [walked to the park] wagged its tail happily”
- likely: main verb interpretation for “walked”
- less likely: reduced relative clause for “walked”
Ex: “the treasure [buried in the sand] was never found”
- likely: reduced relative clause for “buried”—there is NO garden path effect
- less likely: main verb interpretation for “buried”
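A toy sketch of the constraint-based idea applied to the “treasure” example above: both analyses stay active while weighted constraints (semantic fit, context, frequency) push activation toward one of them. All constraint names, weights, and scores are invented for illustration.

```python
# Toy sketch of constraint-based ambiguity resolution: both analyses of the
# ambiguous fragment are active at once, and weighted constraints push
# activation toward one of them. All weights and scores are invented.

constraint_weights = {"semantic_fit": 0.5, "context": 0.2, "frequency": 0.3}

# Support each constraint gives each analysis of "the treasure buried ..."
# (0 = no support, 1 = full support); values are illustrative only.
support = {
    "main verb ('the treasure buried something')":
        {"semantic_fit": 0.1, "context": 0.5, "frequency": 0.8},
    "reduced relative ('the treasure that was buried')":
        {"semantic_fit": 0.9, "context": 0.5, "frequency": 0.2},
}

def activation(analysis):
    return sum(constraint_weights[c] * support[analysis][c]
               for c in constraint_weights)

for analysis in support:
    print(f"{analysis}: activation = {activation(analysis):.2f}")

winner = max(support, key=activation)
print("Favoured analysis:", winner)
# With strong semantic support, the reduced relative wins, so no garden path
# effect is predicted for this sentence.
```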
Factors affecting processing difficulty
1) Thematic Relations
2) Syntactic Frames of Verbs
3) Frequency-based Information
4) Context
(1) Thematic Relations
Thematic Relations:
The number and roles of participants involved in an event (aka the arguments in a sentence), eg:
- run: the runner (1 argument)
- bite: the biter; the bitee (2 arguments)
- send: the sender; the sendee; the item sent (3 arguments)
Garden Path Theory (GPT): thematic relations CAN’T influence initial analysis
Constraint-Based Model (CBM): thematic relations CAN influence early decisions
(1) Thematic Relations
SAME Research Results
UNAMBIGUOUS EXAMPLE
The florist sent the flowers — two possible analyses:
- (a) main verb analysis
The florist | sent | the flowers
SUBJ | MAIN V | OBJ
- (b) reduced relative clause
The florist | [sent the flowers]
SUBJ | [RELATIVE CLAUSE (modifier)]
Analysis: both GPT and CBM choose (a)
- GPT: SVO = simplest structural analysis
- CBM: thematic relations of “send”:
- the sender: requires a person; “the florist” = a good fit for the subject in the main verb analysis
- the item sent: “the flowers” = a good fit for the object
AMBIGUOUS EXAMPLE
The florist sent the flowers was aware…
Analysis: both GPT and CBM initially adopt (a), the main verb analysis
- both theories therefore predict processing difficulty at “was”
(1) Thematic Relations
DIFFERENT Research Results
Ex: The treasure buried in the sand — two possible analyses:
1) Garden Path Theory:
(a) main verb analysis
The treasure | buried | in the sand
SUBJ | MAIN V | LOCATION
GPT: adopts (a), the main verb analysis (SVO; simplest analysis)
2) Constraint-Based Model:
(b) reduced relative clause
The treasure | [buried in the sand]
SUBJ | [RELATIVE CLAUSE (modifier)]
CBM: adopts (b), the reduced relative clause analysis, based on the thematic relations of “bury”:
- the one who buries: NP “the treasure” = a bad fit for the subject in the main verb analysis
- the item buried: NP “the treasure” = a good fit for the subject in the relative clause analysis
Research Results
The treasure buried in the sand was lost …
GPT: adopts the main verb analysis
- predicts processing difficulty at “was”
CBM: adopts the reduced relative clause analysis
- predicts NO processing difficulty
- readers are typically unsurprised by “was”, consistent with CBM, since thematic roles are considered early on in parsing
(2) Syntactic Frames
Syntactic Frames:
Templates that describe the kinds of syntactic elements a verb can take
Examples:
(1) fall
- no element required
- eg: The book fell.
(2) eat
- typically takes an NP object
- eg: The nurse ate [the sandwich].
- grammatical but vague without it: The nurse ate.
(3) put
- requires NP and PP
- eg: Rosa put [the parcel] [on the table].
- incorrect: Rosa put.
(4) say
- requires a sentential/clausal element
- eg: Raj says [the report is incomplete].
- incorrect: Raj says.
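A toy subcategorization lexicon for the verbs above, with a checker that tests whether a verb's required elements are present; the representation and checker are invented for illustration, not a standard resource.

```python
# Toy sketch of syntactic frames: each verb lists the elements it needs
# after it. The representation and checker are invented for illustration.

frames = {
    "fall": [],             # no element required: "The book fell."
    "eat":  ["NP"],         # "The nurse ate [the sandwich]." (object usually present)
    "put":  ["NP", "PP"],   # "Rosa put [the parcel] [on the table]."
    "say":  ["S"],          # "Raj says [the report is incomplete]."
}

def frame_satisfied(verb, elements_after_verb):
    """True if the material after the verb supplies every required element."""
    return all(req in elements_after_verb for req in frames[verb])

print(frame_satisfied("put", ["NP", "PP"]))   # True:  "Rosa put the parcel on the table."
print(frame_satisfied("put", []))             # False: "Rosa put." is incomplete
print(frame_satisfied("fall", []))            # True:  "The book fell."
```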
(2) Syntactic Frames
Syntactic Frame Ambiguity
AMBIGUITY: syntactic ambiguity can arise when a verb allows more than one syntactic frame; frame preferences create a bias in the parser's expectations.
Less Bias: some verbs allow for more than one syntactic frame
(1) notice; NP
- Lin noticed [the answer]
(2) notice; Sentential element
- Lin noticed [the answer was incorrect]
More Bias: other verbs are more biased towards one syntactic frame over others (preference)
- NP-bias verbs (eg: accept, repeat, advocate…)
- S-bias verbs (eg. realize, decide, promise…)
Garden Path Theory:
According to GPT, syntactic frame biases CANNOT influence initial analysis
Constraint-Based Model:
According to CBM, syntactic frame biases CAN influence early decisions
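A toy contrast between the two claims: under CBM, frame bias is available to the earliest decision, whereas under GPT the initial analysis ignores it. The bias proportions for the NP-bias verb “accept” and the S-bias verb “realize” are invented placeholders, not corpus counts.

```python
# Toy contrast between the two models' use of frame bias.
# The proportions below are invented placeholders, not corpus counts.

frame_bias = {
    "accept":  {"NP": 0.9, "S": 0.1},   # NP-bias verb (illustrative numbers)
    "realize": {"NP": 0.2, "S": 0.8},   # S-bias verb  (illustrative numbers)
}

def cbm_first_guess(verb):
    """CBM: frame bias is a constraint available to the earliest decision."""
    biases = frame_bias[verb]
    return max(biases, key=biases.get)

def gpt_first_guess(verb):
    """GPT: the initial analysis uses simple structure-building only
    (treat a following NP as the direct object), ignoring the verb's bias."""
    return "NP"

for verb in ["accept", "realize"]:
    print(verb, "| GPT initial guess:", gpt_first_guess(verb),
          "| CBM initial guess:", cbm_first_guess(verb))
# For an S-bias verb like "realize", CBM expects a clause right away, while
# GPT first tries the direct-object analysis and reanalyzes later if needed.
```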