Sentence Processing Pt1 Flashcards

1
Q

Sentential Structure

A

In English: (S)ubject (V)erb (O)bject

  • eg “the dogs chased the cats”
    NP(subj) V NP(obj)
2
Q

Parsing

A

Dividing a sentence into its grammatical parts and analyzing their syntactic roles

  • collection of structure-building mechanisms and procedures
              S
           ⟋     ⟍
        NP         VP
    The dogs     ⟋    ⟍
                V       NP
            chased    the cats
3
Q

Incremental Parsing

A

Incrementality: builds the sentential structure and the meaning as the words in the sentence unfold

Outcomes of incremental parsing are NOT always correct
- eg activation of onset competitors
- hearing the onset “ham…” activates: hammer, hammock, hamstring …

The system will have to suppress the incorrect interpretations
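The activation and suppression of onset competitors can be sketched as a toy cohort-style filter (the lexicon and the filtering scheme are illustrative assumptions, not a real model):

```python
# Toy sketch of incremental activation of onset competitors.
# LEXICON is an invented mini-vocabulary for illustration.
LEXICON = ["hammer", "hammock", "hamstring", "ham", "beaker", "beetle"]

def active_cohort(heard_so_far):
    """All words still consistent with the input heard so far."""
    return [w for w in LEXICON if w.startswith(heard_so_far)]

print(active_cohort("ham"))   # several competitors still active
print(active_cohort("hamm"))  # incorrect candidates drop out as input unfolds
```

As more of the word arrives, candidates inconsistent with the input are pruned, mirroring the suppression of incorrect interpretations.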

4
Q

Temporary Ambiguity: Written

A

Temporary ambiguity causes a brief misinterpretation of a sentence’s syntax

  • eg “while Anna dressed the baby was in the crib”
  • “while Anna dressed”
    subj verb
  • “while Anna dressed the baby”
    subj verb obj
  • “[while Anna dressed] [the baby was in the crib]”
    [subj verb] [subj verb obj]
5
Q

Relative Clause

A

A clause that modifies a noun
- example:

Ambiguity: “the general presented copies of the report was aware of the issue”

Relative clause: “the general who presented copies of the report was aware of the issue”
- who presented… modifies “the general”

6
Q

Reduced Relative Clause

A

A structure involving a relative clause where certain function words are omitted, thereby resulting in ambiguity

  • “the general presented copies of the report was aware of the issue”
    • missing function word “who”
7
Q

Garden Path Sentences

A

Sentences that create a temporary ambiguity as they are incrementally processed

Disambiguating Region: the first point at which only the correct interpretation is consistent with the unfolding sentence

  • eg “while Anna dressed the baby was in the crib”
    • processing difficulty (longer reaction time) often observed at the disambiguating region
8
Q

Parsing challenges: ambiguity

A

1) Temporary Ambiguity

2) Global Ambiguity

9
Q

Temporary Ambiguity: Spoken

A

Temporary ambiguity causes a brief misinterpretation of a sentence’s syntax

  • eg “Pick up the /bi/…”
    • /bi/ “beaker”
    • /bi/ “beetle”
10
Q

2) Global Ambiguity

A

Sentences that can be interpreted in multiple ways/have more than one meaning depending on the groupings of words

Attachment Preferences
- eg: “the hooligan damaged the new shop with the fireworks”
- “with the fireworks” attached to:
- noun: “shop with the fireworks”
- verb: “damaged … with the fireworks”

11
Q

Examining processing difficulties

A

Self-Paced Reading Task:
A behavioural task intended to measure processing difficulty at various points in the sentence

  • participants read through sentences one word or phrase at a time by pressing a key to advance through the sentence
12
Q

Self-Paced Reading Task EXAMPLE

A

(1)
(a) the hooligan (b) damaged (c) the new shop (d) with the (E) fireworks

(2)
(a) the hooligan (b) damaged (c) the new shop (d) with the (F) discounts

  • analysis: sentences are divided into regions; reactions times measured for each region
  • region of interest: (E)/(F)
  • if longer reading times are observed for (F) “discounts” in (2) than for (E) “fireworks” in (1), we can conclude that people have more difficulty parsing the structure in (2) than in (1)
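The region-by-region analysis above can be sketched with hypothetical reading times (all numbers are invented for illustration):

```python
# Hypothetical self-paced reading times (ms) at the region of interest,
# one value per participant. Numbers are invented for illustration.
rt_fireworks = [310, 295, 330, 305]   # sentence (1), region (E)
rt_discounts = [420, 450, 395, 430]   # sentence (2), region (F)

def mean(xs):
    return sum(xs) / len(xs)

diff = mean(rt_discounts) - mean(rt_fireworks)
print(f"mean RT difference at region of interest: {diff:.1f} ms")
# a positive difference suggests greater parsing difficulty in (2)
```

A real study would compare conditions statistically across many participants and items; this only shows the shape of the comparison.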
13
Q

Models of Ambiguity Resolution

A

1) Garden Path Theory

2) Constraint-Based Model

14
Q

1) Garden Path Theory

A

Frazier & Fodor (1978)
Two stages:
(a) initial analysis
- identify parts of speech
- parse hierarchical sentential structure
(b) reanalysis
- influence of other info (plausibility, context …)

Garden Path theorists suggest that when we parse linguistic information, we initially build the simplest analysis—one structure, one meaning only
- in English, SVO
- all other info sources later, prompting reanalysis
- avoids syntactic complexity
The parser conserves cognitive resources by using simple structure-building heuristics

15
Q

Garden Path Theory EXAMPLE

A

Ex: “While Anna dressed the baby was in the crib”

Initial Structural Analysis:
- simplest analysis: SVO
- analyze “the baby” as the direct object

Reanalysis:
- “was” = disambiguating region—renders the simple analysis implausible
- use context and meaning at a later time to revise analysis
- “the baby” ≠ direct object
- “the baby” = subject of next clause

16
Q

Garden Path Theory: Minimal Attachment

A

When more than one structure is licensed and consistent with the input, build the structure with the fewest nodes

Pre X-bar structures:
“While Anna dressed the baby”

               S
            ⟋     ⟍
         Adv        S
        While     ⟋   ⟍
                NP       VP
               Anna    ⟋    ⟍
                      V       NP
                  dressed   the baby

NOT:

                   S
               ⟋       ⟍
             S            S
          ⟋    ⟍       ⟋    ⟍
       Adv       S     NP      VP
      While    ⟋  ⟍  the baby   |
             NP     VP           V
            Anna     |          was
                     V
                 dressed
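The fewest-nodes heuristic on this card can be sketched by counting syntactic nodes in candidate parses (the tuple tree encoding and helper function are illustrative assumptions):

```python
# Sketch of Minimal Attachment: represent candidate parses as nested
# (label, *children) tuples and prefer the one with fewer nodes.
# Trees are simplified versions of those on this card.

def count_nodes(tree):
    """Count every labelled syntactic node in a tuple-encoded tree."""
    if isinstance(tree, str):      # a bare word, not a syntactic node
        return 0
    label, *children = tree
    return 1 + sum(count_nodes(c) for c in children)

# "the baby" attached as object of "dressed" (one clause)
simple = ("S",
          ("Adv", "while"),
          ("S", ("NP", "Anna"),
                ("VP", ("V", "dressed"), ("NP", "the baby"))))

# "the baby" as subject of a second clause
complex_ = ("S",
            ("S", ("Adv", "while"),
                  ("S", ("NP", "Anna"), ("VP", ("V", "dressed")))),
            ("S", ("NP", "the baby"), ("VP", ("V", "was"))))

preferred = min([simple, complex_], key=count_nodes)
print(count_nodes(simple), count_nodes(complex_))  # 7 vs 11 nodes
```

With fewer nodes, the object analysis wins initially, which is exactly what produces the garden path at “was”.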

17
Q

2) Constraint-Based Model

A

MacDonald and colleagues (1994)
Constraint-based theorists propose that all possible structures are activated at once as the input is presented
- different info sources (constraints) affect the parser’s decision, such as:
- semantic info
- contextual info
- frequency characteristics
- constraints are heavily biased to choose one interpretation over another
- not all structures lead to garden path effect
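The idea of weighted constraints jointly activating candidate structures can be sketched as follows (constraint names, weights, and scores are all invented for illustration):

```python
# Toy sketch of constraint-based ambiguity resolution: each candidate
# structure receives activation from weighted information sources.
constraints = {"semantic fit": 0.5, "context": 0.3, "frequency": 0.2}

def activation(scores):
    """Combine per-constraint scores (0-1) into one activation value."""
    return sum(constraints[c] * s for c, s in scores.items())

# "the treasure buried ...": treasure is a poor agent but a good theme,
# so semantic fit favours the reduced relative reading
main_verb  = activation({"semantic fit": 0.1, "context": 0.5, "frequency": 0.7})
rel_clause = activation({"semantic fit": 0.9, "context": 0.5, "frequency": 0.3})
winner = "reduced relative" if rel_clause > main_verb else "main verb"
print(winner)
```

When the constraints strongly favour one structure, the competitor never dominates, so no garden path effect is predicted.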

18
Q

Constraint-Based Model EXAMPLE

A

Ex: “the dog [walked to the park] wagged its tail happily”
- likely: main verb interpretation for “walked”
- less likely: reduced relative clause for “walked”

Ex: “the treasure [buried in the sand] was never found”
- likely: reduced relative clause for “buried”—there is NO garden path effect
- less likely: main verb interpretation for “buried”

19
Q

Factors affecting processing difficulty

A

1) Thematic Relations
2) Syntactic Frames of Verbs
3) Frequency-based Information
4) Context

20
Q

(1) Thematic Relations

A

Thematic Relations:
The number and roles of participants involved in an event (aka the arguments in a sentence), eg:
- run: the runner (1 argument)
- bite: the biter; the bitee (2 arguments)
- send: the sender; the sendee; the item sent (3 arguments)

Garden Path Theory (GPT): thematic relations CAN’T influence initial analysis

Constraint-Based Model (CBM): thematic relations CAN influence early decisions

21
Q

(1) Thematic Relations
SAME Research Results

A

UNAMBIGUOUS EXAMPLE
The florist sent the flowers — two possible analyses:
- (a) main verb analysis
The florist | sent | the flowers
SUBJ | MAIN V | OBJ
- (b) reduced relative clause
The florist | [sent the flowers]
SUBJ | [RELATIVE CLAUSE (modifier)]

Analysis: both GPT and CBM choose (a)
- GPT: SVO = most sensible structure analysis
- CBM: thematic relations of send:
- the sender role requires a person; “the flowers” fills the item-sent role
- the sender (“the florist”) = a good fit for subject in main verb analysis

AMBIGUOUS EXAMPLE
The florist sent the flowers was aware

Analysis: both GPT and CBM initially ascribe to (a) main verb analysis
- both theories predict that processing difficulty arises at “was”

22
Q

(1) Thematic Relations
DIFFERENT Research Results

A

DIFFERENT RESULTS:
Ex: The treasure buried in the sand — two possible analyses:

1) Garden Path Theory:
(a) main verb analysis
The treasure | buried | in the sand
SUBJ | MAIN V | LOCATION

GPT: ascribe to (a) main verb analysis (SVO; simplest analysis)

2) Constraint-Based Model:
(b) reduced relative clause
The treasure | [buried in the sand]
SUBJ | [RELATIVE CLAUSE (modifier)]

CBM: ascribe to (b) reduced relative clause analysis; thematic relations of “bury”:
- the one who buries: NP “the treasure”= bad fit for subject in main verb analysis
- the item buried: NP “the treasure”= good fit for subject in relative clause

Research Results
The treasure buried in the sand was lost

GPT: ascribe to main verb analysis
- show processing difficulty
CBM: ascribe to reduced relative clause
- NO processing difficulty
- most people adhere to CBM and are unsurprised by “was”, since thematic roles are considered early on in parsing

23
Q

(2) Syntactic Frames

A

Syntactic Frames:
Templates that describe the kinds of syntactic elements a verb can take

Examples:
(1) fall
- no element required
- eg: The book fell.

(2) eat
- requires NP
- eg: The nurse ate [the sandwich].
- correct, but vague: The nurse ate

(3) put
- requires NP and PP
- eg: Rosa put [the parcel] [on the table].
- incorrect: Rosa put.

(4) say
- requires a sentential/clausal element
- Raj says [the report is incomplete].
- incorrect: Raj says.
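The frame templates above can be sketched as a simple lookup table (a simplified toy that treats each verb's frame as strictly required, ignoring optional complements like the object of "eat"):

```python
# Sketch of syntactic frames as templates: which complement types a verb
# takes. Frame inventory is simplified from this card's examples.
FRAMES = {
    "fall": [],            # no complement required: The book fell.
    "eat":  ["NP"],        # NP object: The nurse ate [the sandwich].
    "put":  ["NP", "PP"],  # NP and PP: Rosa put [the parcel] [on the table].
    "say":  ["S"],         # clausal complement: Raj says [the report ...].
}

def frame_ok(verb, complements):
    """Check whether the supplied complement types satisfy the verb's frame."""
    return FRAMES.get(verb, []) == list(complements)

print(frame_ok("put", ["NP", "PP"]))  # Rosa put [the parcel] [on the table]
print(frame_ok("put", []))            # *Rosa put.
```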

24
Q

(2) Syntactic Frames
Syntactic Frame Ambiguity

A

AMBIGUITY: syntactic ambiguity depends on the verb’s syntactic frame, creating a bias in the parser’s expectations.

Less Bias: some verbs allow for more than one syntactic frame
(1) notice; NP
- Lin noticed [the answer]

(2) notice; Sentential element
- Lin noticed [the answer was incorrect]

More Bias: other verbs are more biased towards one syntactic frame over others (preference)
- NP-bias verbs (eg: accept, repeat, advocate…)
- S-bias verbs (eg: realize, decide, promise…)

Garden Path Theory:
According to GPT, syntactic frame biases CANNOT influence initial analysis

Constraint-Based Model:
According to CBM, syntactic frame biases CAN influence early decisions

25
Q

(2) Syntactic Frames
Research Results

A

Syntactic Frames Research
Completion Task (Trueswell et al., 1993)
- S-bias verb: “realize”

Lin realized the answer — two possible analyses:

  • GPT adheres to (a) NP analysis
    Lin realized [the answer]
  • CBM adheres to (b) S analysis
    Lin realized [the answer…]

Ex: Lin realized the answer was incorrect
- GPT: ascribe to NP analysis—show processing difficulty
- CBM: ascribe to S analysis—NO processing difficulty

Results for bias on syntactic frames of “realize”:
- NP-analysis 7%
- S-analysis 93%
- no processing difficulty for “was…”
- most participants placed their bets on the correct frame bias—SUPPORTS CBM

26
Q

(3) Frequency

A

According to CBM, processing difficulties can be affected by the frequency of specific constructions

Ex: “Someone shot the maid of the actress who was standing on the balcony with her husband”

Q: whose husband?
- the maid
(“…shot the maid…who was standing…with her husband”)
- the actress
(“the actress who was standing…with her husband”)

27
Q

(3) Frequency
Attachment Preferences

A

“Someone shot the maid of the actress who was standing on the balcony with her husband”
Q: who was standing with her husband?

Late Closure (GPT): attach the newly encountered phrase (“who was standing…”) to the most recently encountered info (“the actress”)
- GPT: simplest analysis; most recent phrase=less memory required
- tendency for English speakers (A: “the actress”)

Early Closure (CBM): attach newly encountered phrase (“who was standing…”) to the earlier info (“the maid”)
- CBM: analyze attachment depending on frequency-based info of attachment
- tendency for Spanish speakers (A: “the maid”)

Attachment Predictions: (for Spanish)
- GPT; Late Closure: show processing difficulty
- CBM; Early Closure: NO processing difficulty

28
Q

(3) Frequency
Active/Passive Voice

A

Active Voice Bias: eg entertain
- more likely: Hamlet entertained the guest
- less likely: Hamlet was entertained by the guest

Passive Voice Bias: eg accuse
- less likely: Hamlet accused the guest
- more likely: Hamlet was accused by the guest

GPT: frequency-based info CANNOT influence initial analysis
CBM: frequency-based info CAN influence early decisions

29
Q

(3) Frequency
Ambiguous Sentences

A

Ambiguous sentence: “The suspect accused…”; two analyses:

(a) main verb interpretation (GPT)
- The suspect accused
- SUBJECT MAIN V

(b) reduced relative clause interpretation (CBM)
- The suspect [accused … ]
- SUBJECT relative clause

30
Q

(3) Frequency
Ambiguous Sentences (Research)

A

Prediction: active voice is more associated with main verb analysis; passive voice is more associated with relative clause

Ambiguous Sentences (Research)
eg: “The suspect accused was released”

GPT: ascribe to main verb analysis → show processing difficulty
CBM: ascribe to reduced relative clause analysis → no processing difficulty

31
Q

(4) Context

A

Context: written, spoken, or visual information that forms the setting for, and reinforces the meaning of, a linguistic element
- helps build a mental image
- reduces ambiguity

GPT: context does NOT influence initial analysis
CBM: context DOES influence early decisions

32
Q

(4) Context
Linguistic (Written) Context

A

Linguistic Context
Target phrase: “The horse raced past the barn fell”

Context: “Farmer Bill and Farmer John were racing their horses through the field. Bill rode his horse along the fence, while John raced his horse past the barn. Suddenly, the horse raced past the barn fell”.
- context clarifies who is involved in the setting
- before target phrase, “raced” was used in a more passive way

Two analyses:
(a) main verb analysis
The horse raced …
SUBJ main V

  • GPT: ascribe to main verb analysis → show processing difficulty

(b) reduced relative clause
The horse [raced …
SUBJ relative clause

  • CBM: ascribe to reduced relative clause analysis → NO processing difficulty
33
Q

(4) Context
Visual Context/Early Eyetracking
(Tanenhaus et al. 1995)

A

Visual Context
Target Phrase: “Put the apple on the towel in the box”

(A) 🍎🧣 (B) 🧣
(C) ✏️ (D) 📦

Early Eyetracking Research (Tanenhaus et al. 1995)
- presented four objects:
(A) 🍎🧣 (B) 🧣
(C) ✏️ (D) 📦
- Q: how likely is it for listeners to look at the empty towel? (looking for garden path effects)
- eg: 🍎🧣 → 🧣 (X)

34
Q

(4) Context
Manipulation 1: Ambiguous/Unambiguous
(Tanenhaus et al. 1995)

A

Manipulation 1: ambiguous/unambiguous context

Ambiguous: “Put the apple on the towel in the box”
- expect more looks to empty towel
- eg: 🍎🧣 → 🧣 (X)

Unambiguous: “Put the apple that’s on the towel in the box”
- “that’s” clarifies the empty towel is not the final destination
- 🍎🧣 → 📦 (✓)

35
Q

(4) Context
Manipulation 2: one referent/two referents
(Tanenhaus et al. 1995)

A

Manipulation 2: one referent/two referent context
“Put the apple on the towel in the box”
Q: how likely is it for listeners to look at the empty towel depending on the context?

One referent: only one referent (apple) in the scene
(A) 🍎🧣 (B)🧣
(C) ✏️ (D) 📦
- less context; more likely to look at (B)🧣

Two referent: two referents (apples) in the scene
(A) 🍎🧣 (B)🧣
(C) 🍎 (D) 📦
- extra modifier in (A) adds context—🍎🧣, not 🍎
- less likely to look at (B)🧣

36
Q

(4) Context
One-referent gaze patterns; ambiguous/unambiguous

A

Ambiguous: “Put the apple on the towel in the box”
(A) 🍎🧣 (B) 🧣
(C) ✏️ (D) 📦

Gazes: start in middle, look to (A)🍎🧣, then to (B)🧣, back to (A)🍎🧣, then to (D)📦
- garden path effects at 1500ms

Unambiguous: “Put the apple that’s on the towel in the box”
(A) 🍎🧣 (B) 🧣
(C) ✏️ (D) 📦

Gazes: start in middle, look to (A)🍎🧣 then to (D)📦
- no garden path effects

37
Q

(4) Context
Two-referent gaze patterns; ambiguous/unambiguous

A

Ambiguous: “Put the apple on the towel in the box”
(A) 🍎🧣 (B) 🧣
(C) 🍎 (D) 📦

Unambiguous: “Put the apple that’s on the towel in the box”
(A) 🍎🧣 (B) 🧣
(C) 🍎 (D) 📦

Ambiguous and Unambiguous
Gazes: start in middle, look to (C)🍎, then to (A)🍎🧣, then to (D)📦
- no difference in eye gaze patterns between conditions
- no garden path effects (no looks to🧣)

38
Q

(4) Context
Summary of (Tanenhaus et al. 1995)

A
  • Proportion of trials with eye movements to incorrect destination
  • more looks to incorrect destination in one-referent ambiguous condition than two-referent ambiguous condition
  • eye movements showed no evidence of garden path difficulty when the context is appropriate to the structure

Visual context helps clarify ambiguity, allowing processing earlier on in the sentence and reducing garden path effects
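The summary above can be sketched as a toy comparison of looks to the incorrect destination across conditions (all proportions are invented for illustration, not the published data):

```python
# Hypothetical proportions of trials with looks to the incorrect
# destination (the empty towel), in the spirit of Tanenhaus et al. (1995).
looks_to_incorrect = {
    ("one-referent", "ambiguous"):   0.55,
    ("one-referent", "unambiguous"): 0.10,
    ("two-referent", "ambiguous"):   0.12,
    ("two-referent", "unambiguous"): 0.10,
}

# The garden path effect is the ambiguity cost within a referent condition
garden_path_effect = (looks_to_incorrect[("one-referent", "ambiguous")]
                      - looks_to_incorrect[("one-referent", "unambiguous")])
print(f"one-referent garden path effect: {garden_path_effect:.2f}")
```

In the two-referent condition the ambiguity cost is near zero, reflecting the claim that an appropriate visual context eliminates the garden path effect.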