Week 10: Language Comprehension Flashcards
Sentence comprehension: 2 main levels of analysis (beginning with P)
Parsing
Pragmatics
Parsing
Pulling a sentence apart - Analyse the syntactic/grammatical structure of sentences - when are different kinds of info used?
Syntactical structure - analyse the rules (like word order) for the formation of grammatical sentences
Pragmatics
Sentence meaning.
Sentence meaning - analyse the intended, as opposed to literal, meaning (e.g. irony/sarcasm)
Four possibilities of parsing:
- Syntactic analysis occurs before semantic analysis (explore grammatical structure before exploring meaning)
- Semantic analysis occurs before syntactic analysis (explore meaning before analysing grammatical structure)
- Syntactic and semantic analyses occur together (analysed together)
- Syntax and semantics are closely related
Parsing: ambiguous sentences
Reveal information about the parsing process
Ambiguity at a GLOBAL LEVEL:
A whole sentence can have two or more meanings (the whole sentence)
E.g. ‘kids make nutritious snacks’
Ambiguity at a LOCAL LEVEL
Various meanings possible at some point when parsing (localised in the sentence)
E.g. ‘The old men and women sat on the bench’
Prosodic cues
- resolve ambiguity & facilitate understanding
- Prosodic cues include
– Stress (or accent)
– Pauses
– Intonation (rise/fall)
– Rhythm
– Word duration
e.g. a pause would be useful in the sentence ‘the old men and women sat on the bench’
Models of parsing
- Two-stage, serial processing models
e.g. Garden path model (Frazier & Rayner, 1982)
- One-stage, parallel processing models
e.g. Constraint-based model (MacDonald et al., 1994)
Garden path model (led down the garden path)
- Misleading content/structure at the beginning of a sentence
- Reader enticed toward incorrect interpretation (i.e. led down the garden path!)
- Must retrace mental footsteps to find an understandable alternative
- Detected by recording eye movements
– Tells us where/when a reader has gone wrong and is re-reading a sentence
E.g. The detectives examined…
…by the reporter…
…revealed the truth about the robbery
Garden Path Model Assumptions
Two-stage, serial processing model (one step after the other)
Makes the following assumptions
– Only one syntactical structure considered initially
– Semantics (meaning) not involved initially
– Simplest syntactical structure chosen using MINIMAL ATTACHMENT and LATE CLOSURE
– If a sentence is incompatible with additional semantic information, interpretation revised (2nd stage)
(so SYNTACTICAL before SEMANTIC - need to understand structure before gaining meaning)
Garden path: Minimal attachment
– Grammatical structure producing the fewest nodes is preferred
– Nodes = major parts of sentence (e.g. nouns/verbs)
The girl knew the answer by heart
(brain likes this sentence as noun and verb instantly connect ‘knew’ and ‘answer’)
The girl knew the answer was wrong
(given additional info about the answer so requires more cognitive effort, against the idea of minimal attachment)
(also known by tracking eye movement)
Garden path: Late closure
New words encountered are attached to current phrase if grammatically permissible
Since Jay always jogs a mile it seems like a short distance to him
§ ‘a mile’ is attached to the current phrase, ‘jogs’ (‘jogs a mile’)
Since Jay always jogs a mile seems like a short distance to him
§ Here it doesn’t make sense to connect ‘jogs’ and ‘a mile’: late closure wrongly includes ‘a mile’ in the first clause, so the sentence must be reanalysed (‘a mile’ is the subject of ‘seems’)
Garden path model: Strengths
- provides a simple account
- use of principles reduces processing demands
Garden path: Weaknesses
- Assumption that we do not use meanings of words initially is inconsistent with some evidence
– Do not always adhere to principles
– No definitive test of model
– Does not account for languages with a preference for early closure (e.g. Spanish, French); the model is anglocentric
Model of parsing 2: Constraint-based model (MacDonald et al., 1994)
One-stage, parallel processing model
Constraint based model: Assumptions
- One-stage, parallel processing model
– All sources of information (syntax, semantics, context) are available to the parser
– Constrain the number of possible interpretations
– Competing syntactic analyses of a sentence activated at the same time
– Structure receiving most support from constraints is highly activated (the one we tend to go to)
(analysing everything together)
Constraint-based model: Strengths
– Efficient: all relevant information is used from the outset when interpreting sentences
– Able to account for more than one syntactic analysis at a time
CBM: Weaknesses
Unlike the garden path model, it fails to make precise predictions about parsing
Unrestricted race model: Assumptions
- Combines aspects of both the garden path and constraint-based models
- Makes the following three assumptions
- All sources of information used to identify the syntactic structure of a sentence (constraint-based model)
- All other structures are ignored, unless the favoured structure is disconfirmed by subsequent information
- If the chosen structure is discarded, reanalysis is undertaken before another structure is chosen (garden path model)
Next topic: Pragmatics
- The study of intended (not literal) meaning
- Figurative language = not to be taken literally
– Metaphor
– Irony
– Idiom
Understanding metaphors: two differing models
1. Standard pragmatic model (Grice, 1975)
2. Prediction model (Kintsch, 2000)
Standard pragmatic model (3 stages)
- Understanding metaphorical statements involves three stages
- Literal meaning first accessed
- Reader/listener decides if literal meaning makes sense
- If literal meaning inadequate, search for suitable non-literal meaning
- Predicts metaphorical meaning accessed more slowly than literal one
– But, some metaphors understood rapidly
Common ground
- Need to adopt the speaker’s perspective to comprehend what they are saying
- Common ground = shared knowledge between speaker and listener
– Work together to ensure mutual understanding
– Very attentionally demanding
Prediction model (2 stages)
- Stage 1: Latent semantic analysis - represents the meanings of words based on their relations with other words
e.g. in the phrase ‘all lawyers are sharks’ you can find meaning in words that link to ‘shark’
- Stage 2: Construction-integration - use information from the first stage to construct an interpretation
e.g. once you have words linking to sharks, not all of them apply (e.g. fins), so you keep those that fit the predicted meaning: e.g. aggressive, vicious would make sense
When there is no common ground: the egocentric heuristic
- Interpretation based on our own knowledge, rather than knowledge shared with the speaker
– Use of egocentric heuristics
- Often the cause of misunderstanding between listener and speaker
Next part of lecture: Inferences (3 types)
- Logical
- Bridging (or backward)
- Elaborative
Logical inferences
Depend on the meaning of words - apply the meaning of words to fill in the blanks
e.g. ‘Jill has very fair skin.’ Inference = Jill is pale
Bridging inferences
Establish coherence between current part of text and preceding text
E.g. ‘She had sunburn on Monday.’ Inference = Jill forgot to use suncream
A type of bridging inference: Causal inferences
Decipher the causal relationship between the current sentence and the previous sentence (uses context and prior knowledge)
TWO STAGES of forming inference:
- BONDING: automatic activation of words from preceding sentence
- RESOLUTION: ensures interpretation is consistent with contextual information
Ken drove to london yesterday.
The car kept overheating.
(Previous sentence gives context to second)
Elaborative inferences
Add details to text using knowledge to expand on information
e.g. ‘She forgot to use sunscreen’ Inference = Jill will be sunburnt
Making inferences: Schema theory
- Schemas can be defined as
– ‘Packets’ of information about the world, events, people, actions etc.
- Contain information needed to understand language
- Allow the formation of expectations
– Used to infer typical sequence of events
– Make the world predictable, as our expectations are generally confirmed (but not always!)
The procedure is quite simple. First, you arrange items into different groups. Of course, one pile may be sufficient depending on how much there is to do. If you have to go somewhere else due to lack of facilities, that is the next step; otherwise, you are pretty well set. It is important not to overdo things. That is, it is better to do too few things at once than too many.
Washing clothes
Schema theory
- Hearing the passage in the absence of a title:
– incomprehensible and not well-remembered
- But if the title is provided before:
– easy to understand and greater recall
- Schema knowledge provided by the title helped with comprehension
- If the title is presented after the passage, but before recall, less information is remembered (readers default to their own interpretation/schema of washing)
(Bransford & Johnson)
Schema theory strengths:
- Semantic knowledge helps with text comprehension and general understanding
- Can account for errors and distortions
Weaknesses:
- Schema theories are hard to test
- When/how schemas are used is unclear
- Exaggerate how error-prone we are