Language Flashcards
psycholinguistics
the study of how we learn, understand, and produce language
why does language distinguish humans from other animals?
animals have fixed sets of communication signals, whereas humans can combine the same words to produce novel complex thoughts
while other animals have vocal or motor capabilities to produce language, they lack the cognitive abilities to generate novel language
functions of language
sharing complex thoughts and emotions, planning the future, organizing into groups, transferring information across generations
behaviorist view of language (nurturist view)
language is learned through conditioning and reinforcement (correct/incorrect feedback) and modeling, language is stimulus-dependent (external stimulus required)
universal grammar
contains the basic scaffolding of syntax without the details (which must be learned); this scaffolding is innate in every human
gene FOXP2
responsible for universal grammar
mutations result in developmental verbal apraxia which affects the ability to pronounce syllables and words
poverty of the stimulus
the rules of grammar are ambiguous with just examples, so language cannot be learned only using examples/conditioning
evidence for poverty of the stimulus
adults who move elsewhere adopt a grammatically insufficient version of the local language (pidgin), but their children impose full, consistent grammar on it to form a new language (creole) = not learned, because that grammar was never modeled for them
deaf isolates (not exposed to traditional sign language) develop their own kind of sign language
evidence for a nativist view of language
universal grammar, poverty of the stimulus
language acquisition develops in the same way for all infants (cooing 0-3 months, babbling 4-8 months, single words 8-12 months, two word phrases 1-2 years, telegraphic speech 2-3 years, complex speech 3-4 years)
child-directed/infant-directed speech
speech directed to a child
motherese/parentese (sing-song, exaggerated vowels, repetition) helps children identify beginnings and ends of sentences and draws attention to important concepts - accelerated language learning
phonemes
smallest units of speech sound; they carry no meaning themselves, but changing one can change a word’s meaning
morphemes
smallest unit of meaningful speech
resolving phonological ambiguity (identifying phonemes)
context (listeners were often unable to identify a single word when it was excised from its context)
phonemic restoration effect: brain fills in the missing phoneme based on expectations
McGurk effect: brain uses mouth movements (visual signal) which move characteristically based on sounds
how to segment speech into individual morphemes?
statistically: we encode the frequency with which sounds occur together
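the statistical-segmentation idea can be sketched in code; this is a minimal toy illustration (not from the cards), assuming a made-up syllable stream built from two invented “words”, pre-to-ty and bi-da-ku: word boundaries tend to fall where the probability of the next syllable given the current one drops.

```python
# Toy sketch of statistical speech segmentation: boundaries are placed
# where the transitional probability P(next syllable | current syllable)
# dips below a threshold. Syllables and threshold are illustrative.
from collections import Counter

def transitional_probs(syllables):
    """P(b | a) for each adjacent syllable pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

def segment(syllables, threshold=0.9):
    """Insert a word boundary wherever the transitional probability is low."""
    probs = transitional_probs(syllables)
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if probs[(a, b)] < threshold:  # low predictability = likely boundary
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# a stream built from two repeated "words": pre-to-ty and bi-da-ku
stream = ["pre", "to", "ty", "bi", "da", "ku", "bi", "da", "ku",
          "pre", "to", "ty", "pre", "to", "ty", "bi", "da", "ku"]
print(segment(stream))
# → ['pretoty', 'bidaku', 'bidaku', 'pretoty', 'pretoty', 'bidaku']
```

within-word transitions here always occur together (probability 1.0), while cross-boundary transitions are less predictable, so the boundaries fall out of frequency alone.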
lexical processing
matching speech units to meaning
homophones
words which sound the same but have different meanings
homographs
words which are spelled the same but have different meanings
lexical decision task (LDT) and results
a string of letters is presented and participants must decide whether it is a word
RT is faster when words that are related appear together
RT is faster when the words are common
resolving lexical ambiguity (lexical processing)
using context (the surrounding words, situation, or environment) to decide among the possible meanings of an ambiguous word - we initially settle on the most frequent meaning
does context prevent other meanings of individual words from being activated in lexical processing?
brain briefly considers all meanings before settling on the context-appropriate one (in a cross-modal priming task, RT on an LDT was faster for both context-consistent and context-inconsistent meanings if the target word was presented within 200 ms of a biased or unbiased priming sentence)
parsing
breaking up a sentence into its constituent parts, can lead to ambiguity because we hear sentences incrementally
garden-path sentence
sentence that leads to incorrect parsing, so individuals take the wrong path and hit a dead end (we have to reinterpret the clause when we get to the end)
clause, subject, predicate
a clause expresses a complete idea of a subject doing something, which is indicated by the predicate
syntax-first approach to parsing
we parse based on syntax without considering the meaning of words beyond the type (noun or verb?)
late closure
we attach words to the sentence we’re currently processing rather than assuming a new phrase
“the man who whistles tunes pianos” - we first treat ‘whistles’ as the predicate and so take ‘tunes’ as a noun, but ‘pianos’ is a noun and forces us to re-parse = ‘tunes’ must become the predicate
evidence against syntax-first approach
we do take semantics into consideration: “the defendant examined by the lawyer” vs. “the evidence examined by the lawyer” - a defendant can examine something so re-parsing is needed vs. evidence cannot examine something so no re-parsing
what do we use to help us parse sentences?
visually-available stimuli (‘put the apple on the towel in the box’)
prosody (many potential layers of meaning depending on intonation, not grammar - ‘I never said she took the money’), punctuation helps indicate prosody to disambiguate sentences and prevent incorrect parsing
discourse processing
ability to understand language that is at least several sentences long
involves integration of STM and LTM
depends on anaphoric inference and causal inference, and pre-existing knowledge (inferring meanings)
anaphoric inference
inferring that a word in a later sentence (such as a pronoun) refers back to a word in an earlier sentence (the antecedent)
causal inference
assumption that something mentioned at one stage leads to something later on
depends on general knowledge of how the world works
backward/deductive/necessary inference
referring to previous information to infer something that is necessary to understanding
takes processing time, so sentences which require it have longer RT - indicative of online inference
elaborative inference
adding information that isn’t essential for understanding of a text - done in a way that is consistent with expectations
online vs. offline inference
online: during reading or listening
offline: during consolidation or retrieval
instrumental inference
tool used for a task is inferred even if it’s not required to understand the meaning
a type of elaborative inference
does elaborative inference occur online or offline?
can occur online if a sentence is rich enough in information so as not to require backward inference (“leisurely pace” - implies nothing vs. “hole-in-one” - implies golf)
neurolinguistics
relationship between linguistic behaviour and the brain
arcuate fasciculus
bundle of fibers that connects Broca’s and Wernicke’s areas (it’s absent in other species, so may be important for humans’ linguistic capabilities)
right hemisphere’s role in language
higher-level discourse processing, elaborative processing
language relativity and evidence
the language we speak affects other areas of cognition
people are better able to distinguish between colours when their language has more than one word for that colour
nativism
linguistic universalists: differences among languages are superficial and don’t affect other areas of cognition
natural language processing (NLP)
subfield of AI concerned with making machines that can produce and understand language: detecting emotional tone, making summaries, engaging in conversation with humans
Turing test
a human converses with both another human and a machine and must decide which is the machine
sequence-to-sequence learning
taking in a string of text and producing a string of text in response - the machine doesn’t learn explicit rules of syntax (e.g., ChatGPT)
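the “no explicit syntax rules” point can be illustrated with a deliberately tiny sketch (a far simpler stand-in for real sequence-to-sequence models, with an invented two-sentence corpus): a bigram model learns only which words follow which in its training text, yet can still produce text, without ever being given a grammar.

```python
# Toy bigram text generator: "learning" here is just counting which
# words follow which - no rules of syntax are ever written down.
from collections import defaultdict
import random

def train_bigrams(corpus):
    """For each word, record every word that follows it in the corpus."""
    table = defaultdict(list)
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            table[a].append(b)
    return table

def generate(table, start, length, seed=0):
    """Produce text by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = table.get(out[-1])
        if not followers:  # dead end: no known continuation
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = ["the dog chased the cat", "the cat chased the mouse"]
table = train_bigrams(corpus)
print(generate(table, "the", 5))
```

the output tends to look grammatical only because the statistics of the training text encode word-order regularities implicitly.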
definition of language
shared symbolic system for purposeful communication
how is language affected by the environment?
morphological complexity decreases in languages spoken by larger populations
lexical tone is shaped by climate (tonal languages, which use pitch to convey meaning, are less common in cold countries because cold air reduces vocal control)
how are language and gender style related?
countries with gendered language experience more gender inequality
women tend to use “we” and more adjectives, and use more rising intonation at the ends of statements (upspeak)
aphasia
impaired language function from a brain injury (also from dementias)
Broca’s aphasia
non-fluent/expressive aphasia - cannot produce speech, but can understand it
speech is halting and reduced to content words (nouns and verbs); writing is also affected; the amount of tissue damaged correlates with the degree of impairment
patient Tan
could only speak one syllable but tried to communicate using gestures, tone, inflection - discovery of Broca’s aphasia
Wernicke’s aphasia
fluent - speech is produced but has no meaning (word salad)
uses nonwords and made-up words (paraphasias are common)
verbal paraphasia
substituting a word with another semantically-related one
phonemic paraphasia
swapping or adding speech sounds (“sad cralad” for “crab salad”)
neologisms
using made-up words (mansplain)
can be culturally shared, but in Wernicke’s aphasia they’re not, so they don’t communicate meaning
paraphasia
misusing words, common in Wernicke’s
conduction aphasia
damage to the arcuate fasciculus = a disconnection between understanding and producing speech; patients cannot repeat speech
load-dependent: deficit increases the more complex the sentence is
lateralization of language
left for language (not fully understood), right for broader aspects of language (prosody and pitch to convey mood, meaning, discourse segmentation, gestures)
classic model of language
dorsal pathway from A1 - speech production and movements
ventral pathway from A1 - speech comprehension
may underspecify how language is represented in the brain (damage to Broca’s doesn’t always result in an aphasia and damage to other parts of the brain can result in the same deficits)
principles of the innateness hypothesis
grammar and syntax are separate from meaning (since we can create grammatical sentences that don’t make sense)
language acquisition device supports principles of how to learn a language
our innate language skills are rules of grammar (universal grammar) that need to be adjusted for the specific language
language acquisition device (LAD)
abstracted entity that supports language (hardwired into our brain, controls principles of learning a language)
support for the innateness hypothesis
convergence (children are exposed to different learning situations, but converge on the same grammar)
uniformity: children go through the same learning stages in the same order
poverty of stimulus argument (the linguistic environment is too deficient for children to learn through reinforcement - they hear only a small sample of infinite possibilities and don’t have opportunities to learn from mistakes)
issues with the poverty of stimulus argument
what information is innate?
how to disprove the argument?
how to determine what linguistic information is available to a child?
adults reformulate speech based on grammar - children extract regularities to form rules
types of ambiguity in language
phonological (within a sound)
lexical (within a word)
syntactic/parsing (within a sentence)
constraint-based models of parsing
we use constraints to resolve ambiguity: semantic and thematic context, expectation, frequency (opposition to syntax-first)
surface dyslexia
impaired at reading irregular words because reading happens letter by letter (they must sound words out and cannot consult a mental dictionary)
phonological dyslexia
impaired at reading nonwords or made-up words because they must compare words to a mental lexicon (cannot sound out letter by letter)
dual route model of reading
mental dictionary (whole-word reading) and grapheme-phoneme conversion (sounding out)
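the two routes, and how each dyslexia maps onto losing one of them, can be sketched as code; this is a hypothetical illustration (the lexicon entries and grapheme-phoneme rules are invented for the example, not a real model).

```python
# Hypothetical sketch of the dual route model of reading.
# Route 1 (lexical): whole-word lookup in a mental dictionary.
# Route 2 (grapheme-phoneme conversion): sound out letter by letter.
LEXICON = {"cat": "/kæt/", "yacht": "/jɒt/"}          # illustrative entries
GPC_RULES = {"c": "k", "a": "æ", "y": "j", "o": "ɒ"}  # illustrative rules

def read_word(word, lexical_route=True, gpc_route=True):
    # lexical route: needed for irregular words like "yacht"
    if lexical_route and word in LEXICON:
        return LEXICON[word]
    # GPC route: needed for nonwords, but regularizes irregular words
    if gpc_route:
        return "/" + "".join(GPC_RULES.get(ch, ch) for ch in word) + "/"
    return None  # neither route available: the word cannot be read

# surface dyslexia ~ lexical route impaired: irregular words get regularized
print(read_word("yacht", lexical_route=False))  # /jækht/, not /jɒt/
# phonological dyslexia ~ GPC route impaired: nonwords cannot be read
print(read_word("dap", gpc_route=False))        # None
```

disabling one route reproduces each deficit: without the lexical route irregular words are mispronounced by rule (surface dyslexia); without the GPC route unfamiliar nonwords fail entirely (phonological dyslexia).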
language of thought hypothesis
nativist view; the medium of thought is an innate mentalese, not language (which is why children can think before they have language) - cannot be tested
Sapir-Whorf hypothesis
linguistic determinism (strongest view): person’s thoughts are determined by language (people from different languages have a fundamentally different view of the world)
weaker view that thoughts are influenced by language (colour studies in which language shaped colour memory)