Cognitive modeling Flashcards
What are advantages of formalized mathematical models in comparison to verbally formulated theories?
precision & consistency (verbal terms are rarely well defined)
assumptions must be made explicit (in verbal theories they often remain implicit)
easy, efficient & precise communication of theories
useful as analytical tools (disentangle mixture into components of interest)
Why are free parameters necessary in formal models of cognition?
cognition is very complex (model with natural constants probably does not describe all processes -> better fit with free parameters)
processing shows inter- & intra-individual differences (some parameters not constant)
Why are free parameters problematic?
more free parameters => less specific predictions -> model allows more data patterns -> unfalsifiable
number of independent data points limits number of parameters one can estimate
What is the psychological idea behind Nosofsky’s generalized context model?
exemplar model of stimulus categorization
explanandum (to be explained): categorization & recognition probability of a stimulus
explanans (explaining factors): similarity, judged via psychological distance (driven by overlapping features); the probability of assigning a stimulus to category A is the sum of its similarities to the A exemplars divided by the sum of its similarities to all exemplars
psychological distance is in a multidimensional psychological space, each dimension is a feature (e.g. size, color)
the probability to recognize something as old is the sum of similarity to all exemplars
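The categorization rule above can be sketched in a few lines. The exemplar coordinates, the sensitivity parameter c, and the choice of a city-block distance metric are illustrative assumptions, not Nosofsky's fitted values:

```python
import math

def similarity(x, y, c=1.0, r=1):
    # psychological distance in a multidimensional feature space
    # (r=1: city-block metric, r=2: Euclidean), turned into similarity
    # via exponential decay with sensitivity parameter c
    d = sum(abs(a - b) ** r for a, b in zip(x, y)) ** (1 / r)
    return math.exp(-c * d)

def p_category_A(stimulus, exemplars_A, exemplars_B, c=1.0):
    # GCM rule: summed similarity to the A exemplars divided by
    # summed similarity to all stored exemplars
    sim_A = sum(similarity(stimulus, e, c) for e in exemplars_A)
    sim_all = sim_A + sum(similarity(stimulus, e, c) for e in exemplars_B)
    return sim_A / sim_all

# hypothetical exemplars on two feature dimensions (e.g. size, color)
A = [(1.0, 1.0), (1.2, 0.9)]
B = [(3.0, 3.0), (2.8, 3.1)]
print(p_category_A((1.1, 1.0), A, B))  # stimulus close to A -> probability near 1
```

The summed similarity in the denominator is the same quantity the model uses for old/new recognition judgments.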
What does Nosofsky’s generalized context model explain?
how new stimuli are categorized & how the probability of categorizing something to a specific category is calculated
What is the difference between simulation models & algebraic models in cognitive psychology? Is it a pragmatic or fundamental difference?
the difference lies in how the models are solved
simulation models hard to solve analytically => require PC simulation
algebraic models: analytical solution possible, set of equations & rules for their application, have closed form solutions
pragmatic difference: both are processing models
What is a measurement model in comparison to a process model?
measurement model: use data to estimate parameters that represent cognitive processes, explicit formal assumptions about relation between (unobservable) processes & empirical variables, summaries but no explanation of processes
processing model: formalize cognitive processes mathematically, describe hypothesized information processing steps (often with flow charts, simulation/algebraic models), theories of processes
A formal model defines the data as a function of model parameters. What needs to be done with the model equations to achieve estimation equations for free parameters?
In simple cases: solve the model equations for the free parameters to yield them as a function of the observed data
Least Squares Estimation (continuous data): minimize the residual sum of squares -> numerical search for the best-fitting estimates
Maximum likelihood estimation: make likelihood function: defines probability of data as a function of all possible parameter constellations -> find maximum of the function
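A minimal least-squares sketch, using a hypothetical linear model y = b * x with one free parameter b, made-up data, and a crude grid search standing in for a real optimization routine:

```python
# hypothetical data for the one-parameter model y = b * x
data_x = [1, 2, 3, 4]
data_y = [2.1, 3.9, 6.2, 7.8]

def rss(b):
    # residual sum of squares for slope parameter b
    return sum((y - b * x) ** 2 for x, y in zip(data_x, data_y))

# grid search over candidate values of b in [0, 5]
candidates = [i / 100 for i in range(0, 501)]
b_hat = min(candidates, key=rss)
print(b_hat)  # near the true slope of about 2
```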
What is the idea behind Maximum-Likelihood estimation of parameters?
for a certain formal model, estimates for all free parameters are inserted to find the combination of parameter values for which the observed data is most likely
Likelihood function expresses probability of the observed data as a function of the parameters
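The same idea in code, for the simplest possible case: a binomial model with one free parameter p. The data (7 hits in 10 trials) are hypothetical, and a grid search stands in for a real optimizer:

```python
import math

# hypothetical data: k successes in n trials
n, k = 10, 7

def log_likelihood(p):
    # ln L(p) = k*ln(p) + (n-k)*ln(1-p), dropping the constant binomial term
    return k * math.log(p) + (n - k) * math.log(1 - p)

# evaluate ln L over a grid of candidate parameter values
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=log_likelihood)
print(p_hat)  # the maximum lies at k/n = 0.7
```

Maximizing ln L instead of L is standard practice: the logarithm is monotone, so both have the same maximum, but sums of logs are numerically better behaved than products of probabilities.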
Why are local maxima of the likelihood function a problem for many algorithms of parameter estimation?
the actual shape of the log-likelihood function ln L is unknown
many algorithms only look one step ahead -> they only see whether the function is increasing or decreasing => termination at a local maximum
How is the problem with local maxima of the likelihood function solved?
repeat estimation several times with different random starting values
use optimization routines
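The random-restart idea can be illustrated with a toy one-step-ahead hill climber. The objective function, step sizes, and number of starts are all made up for illustration:

```python
import math, random

def f(x):
    # hypothetical objective with a local maximum near x = -2 (height 1)
    # and the global maximum near x = +2 (height 2)
    return math.exp(-(x + 2) ** 2) + 2 * math.exp(-(x - 2) ** 2)

def hill_climb(x, step=0.5, shrink=0.5, tol=1e-6):
    # greedy one-step-ahead search: moves uphill, halves the step size
    # when neither neighbour is higher -> can get stuck at a local maximum
    while step > tol:
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            step *= shrink
    return x

random.seed(1)
starts = [random.uniform(-5, 5) for _ in range(10)]
results = [hill_climb(x0) for x0 in starts]
best = max(results, key=f)
print(round(best, 3))  # with several restarts, at least one run reaches the global maximum near 2
```

A single run started to the left of the valley terminates at the local maximum near -2; taking the best result over several random starts recovers the global maximum.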