L11-probability Flashcards
Structuralism
Language as a structural system consisting of arbitrary relationships between concepts and their words/sounds. The connection between a word and its meaning is determined only by convention. (1900s: Ferdinand de Saussure)
Behaviorism
Language is learned behaviour: we connect objects with their sounds/words through reinforcement. (1960s: Skinner)
Generativism
An innate, internal system of abstract language rules that governs our language competence. (Chomsky)
Modularity
Language is modular and its computation is automatic. (Fodor)
Transducer (modularity)
We capture sounds from the environment through our hearing/sensing ability. These environmental signals are then transformed so that the language module can use them.
Language module
It takes information from the transducer and decides which sounds are relevant for language (i.e. it processes language). The language module is domain-specific, automatic and fast; it cannot be accessed by other systems; it has a specific neural structure; it goes through the same developmental stages in all humans; and only its output is accessible.
Central system
The central system operates on modular information. The output of the language module can be used by other systems as well.
Probabilistic approach
Probability is added to existing (symbolic and innate) theories of language. The probabilistic approach is applicable to language processing, acquisition and representation.
Representation & probability
Probability can be added to phonology: how a phoneme behaves in a language system is determined by its probability in that language (e.g. some phonemes are more probable than others in a given language).
Probability can also be added to morphology: how a morpheme affects a language system is determined by the probability of its occurrence (e.g. regular past-tense forms are more common than irregular ones).
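The morphological point can be illustrated with simple relative-frequency arithmetic; the corpus counts below are invented for the example.

```python
# Invented corpus counts for past-tense forms (illustration only).
counts = {"walked": 120, "talked": 95, "went": 40, "ate": 25}
regular = {"walked", "talked"}  # assumed classification into regular forms

total = sum(counts.values())
# Relative frequency of regular past-tense forms in this toy corpus.
p_regular = sum(c for w, c in counts.items() if w in regular) / total
print(round(p_regular, 3))
```

With these made-up counts the regular forms account for roughly three quarters of past-tense tokens, matching the card's claim that regular forms are more common.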
Language processing & probability
Language processing involves generating and interpreting probabilities. Probabilities can be incorporated into both serial and parallel processing models.
Serial processing & probability
Probability is added to serial processing via the Bayesian approach.
Bayesian approach
Probability is updated based on previous syntactic structures and their words. Every new piece of information updates the system.
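A minimal sketch of such an update, assuming two competing parses of an ambiguous sentence prefix; all probability values are invented for illustration.

```python
def bayes_update(prior, likelihood):
    """Posterior over analyses: prior x likelihood, renormalized."""
    unnorm = {a: prior[a] * likelihood[a] for a in prior}
    total = sum(unnorm.values())
    return {a: p / total for a, p in unnorm.items()}

# Prior belief over two parses before the next word arrives (assumed values).
prior = {"main_clause": 0.7, "reduced_relative": 0.3}
# How strongly the new word supports each parse (assumed values).
likelihood = {"main_clause": 0.2, "reduced_relative": 0.9}

posterior = bayes_update(prior, likelihood)
print(posterior)
```

Each new word supplies a likelihood, and the posterior becomes the prior for the next word; here the initially dispreferred parse ends up dominating.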
Bottom-up update
A speech signal provides you with information from which to extract meaning. –> Use incoming information to analyze sentences.
Top-down update
Use a language template to interpret the speech signal. –> Use previous knowledge to analyze received information.
Reversed probability
We use both top-down and bottom-up updating, and information is shared between the two systems.
Syntax trees
Hierarchical representation of sentence structure consisting of nodes with probabilities
Probabilistic parsing
Estimating the likelihood of different trees given a sentence and a probabilistic model.
Principle of minimal attachment
The listener prefers the simplest structure, i.e. the parse with the fewest nodes. Fewer nodes cost less and are therefore preferred and more probable.
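In a probabilistic grammar this preference falls out automatically: each extra node adds a rule application with probability at most 1, so the parse's product of probabilities can only shrink. A sketch with invented rule probabilities:

```python
import math

# Invented rule probabilities for a toy probabilistic grammar.
rule_prob = {
    "S -> NP VP": 0.8,
    "VP -> V NP": 0.6,
    "NP -> NP PP": 0.3,  # extra attachment node
}

def parse_prob(rules):
    """A parse's probability is the product of its rules' probabilities."""
    return math.prod(rule_prob[r] for r in rules)

minimal = ["S -> NP VP", "VP -> V NP"]            # fewer nodes
extra_node = minimal + ["NP -> NP PP"]            # one more attachment

print(parse_prob(minimal), parse_prob(extra_node))
```

The minimal-attachment parse necessarily scores at least as high, since multiplying in another rule probability cannot increase the product.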
Computational linguistics in language processing
Analyzing language processing by computational methods
Serial parser & plausibility
Sometimes less probable, more specific interpretations are favoured because they can be checked and revised more easily. –> A serial parser prefers interpretations that can be rapidly falsified.
Lexical information & probability
Lexical probabilities are used in parallel models: individual words can change parsing preferences and probabilities.
Requirements for probabilistic language comprehension
Model of general knowledge;
Theory of mind;
Principles of pragmatics
Parsing preferences are based on:
Plausibility and statistics;
Proximity and relation of words (which words occur together)
Language acquisition & probability
Probability is added to empiricist language acquisition models by investigating linguistic corpora, and to the innate language acquisition model (Chomsky) by choosing between innate candidate grammars based on the probability of the experienced language.
Computational linguistics & language acquisition
Using context-free grammar and parsed trees to study language acquisition.
Poverty of stimulus
We learn language fast and can generate new sentences despite having only limited input. This supports the existence of a faculty of language.
Context free grammar
Rules that always apply in the same way and are independent of context
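A minimal sketch of a context-free grammar: each rule rewrites its left-hand side the same way wherever it appears, regardless of the surrounding symbols. The grammar and derivation below are invented for the example.

```python
# A tiny context-free grammar (invented example): nonterminal -> list of rules.
grammar = {
    "S": [["NP", "VP"]],
    "NP": [["the", "dog"], ["the", "cat"]],
    "VP": [["sleeps"], ["sees", "NP"]],
}

def expand(symbol, choices):
    """Expand `symbol` depth-first, left to right, popping one rule
    index from `choices` for every nonterminal encountered."""
    if symbol not in grammar:  # terminal word
        return [symbol]
    rule = grammar[symbol][choices.pop(0)]
    words = []
    for sym in rule:
        words.extend(expand(sym, choices))
    return words

# Derivation: S -> NP VP, NP -> the cat, VP -> sees NP, NP -> the dog
sentence = expand("S", [0, 1, 1, 0])
print(" ".join(sentence))
```

Note that the second NP expands by the same rule set as the first, independent of where it sits in the tree; that independence is exactly what "context-free" means.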
Parsed trees
The trees are already parsed, so they involve fewer probabilities (less ambiguity left to resolve).
Distributional methods
Linguistic items with similar distributions have similar meanings; hence meaning can be inferred from the relationships between words.
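One common way to operationalize this idea is cosine similarity over co-occurrence vectors; the context words and counts below are invented for the example.

```python
import math

# Invented co-occurrence counts: how often each target word appears
# near the context words ("drink", "bark", "meow"), in that order.
vectors = {
    "coffee": [10, 0, 0],
    "tea":    [8, 1, 0],
    "dog":    [1, 9, 0],
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

sim_coffee_tea = cosine(vectors["coffee"], vectors["tea"])
sim_coffee_dog = cosine(vectors["coffee"], vectors["dog"])
print(sim_coffee_tea, sim_coffee_dog)
```

Because "coffee" and "tea" occur in similar contexts, their vectors point in nearly the same direction, so their inferred meanings come out as similar; "dog" does not.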