Models Flashcards
define model
simplified or idealized representation of a thing
define statistical model
a mathematical relationship between variables that holds under specific assumptions
behaviourist view of cognition
“black box”
input → output; what happens in the brain between the two is unknown
cognitive box and arrow models (and example)
models that describe the relationship between different mental processes → assumption that the mind operates like a multi-stage information-processing machine
started simple but can become very complex
e.g. Broadbent's (1958) filter model of the stages of information processing
studying cognitive models
manipulating input and observing output to figure out what occurs in between
e.g. the "invisible gorilla" video experiment about paying attention to stimuli - why does perception change when attention is directed to the stimulus? - Broadbent's model can be used to explain this
formal cognitive models
aka computational models
a mathematical description of the relationship between mental processes
usually expressed through computer code (see the sketch below)
assumptions are explicit
often provides numerical predictions
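a minimal sketch of what "expressed through computer code" can look like - the exponential forgetting form and the decay_rate value are illustrative assumptions, not a model from the course:

```python
import math

def recall_probability(t_hours, decay_rate=0.3):
    """Illustrative formal model: P(recall) = exp(-decay_rate * t).
    Explicit assumptions: memory strength decays exponentially and
    decay_rate is constant across items and people."""
    return math.exp(-decay_rate * t_hours)

# numerical predictions, not just directional ones
for t in [0, 1, 6, 24]:
    print(f"P(recall) after {t:>2} h = {recall_probability(t):.2f}")
```

note how every assumption is visible in the code, and the output is a specific number where an informal model could only state a direction ("recall gets worse over time")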
informal cognitive models
verbal description of the relationship between different cognitive processes
often some assumptions are implicit
often provides only directional predictions
define simplification and abstraction
simplification = describing only the critical parts, not all of the information
abstraction = generating general rules and concepts from specific information
applying simplification and abstraction to models
- need to balance simplification and abstraction in models
- depending on what question is being asked or which process is being conveyed
- emphasis on the elements that are the purpose of the model e.g. a model train about how the wheels work doesn't need a complex engine
- "simple" is not a criticism of a model → all models are simple → a model is only bad if it is simple in what it is trying to express
- all models are wrong due to simplification, but they are still useful (cf. Box: "all models are wrong, but some are useful")
predictions and or explanations in models
non-scientific models explain after the fact and cannot provide falsifiable predictions (Karl Popper)
scientific models must produce predictions
scientific models predictions
can be directional or numerical
directional = one thing is more/less than the other
numerical = use a data set and correlations to predict the value of a new data point e.g. take a person's income and predict their happiness based on previous data → can extrapolate from current data (see the sketch after this card)
* numerical predictions can vary in levels of accuracy
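a sketch of the income → happiness example (all data values are invented for illustration) - fit a line to previous data, then predict for a new person:

```python
import numpy as np

# invented illustrative data: income (thousands) vs happiness rating (0-10)
income = np.array([20, 35, 50, 65, 80, 95])
happiness = np.array([4.1, 5.0, 5.8, 6.2, 6.9, 7.1])

# fit a simple linear model: happiness ≈ slope * income + intercept
slope, intercept = np.polyfit(income, happiness, deg=1)

# numerical prediction for a new data point (extrapolating from current data)
new_income = 70
print(f"predicted happiness at income {new_income}k: {slope * new_income + intercept:.2f}")
```

the prediction is a specific number, so its accuracy can later be checked against the person's actual happiness rating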
theoretical vs statistical models explanations
theoretical models always provide explanations for the data found
they go beyond statistical models, which only describe the relationship without explaining it
hierarchy of research (5)
framework
theory
model
hypothesis
data
hierarchy of research: framework
conceptual system that defines terms and provides control e.g. cognitive psychology
hierarchy of research: theory and model
theory = scientific proposition that provides relations between phenomena e.g. early-selection theory
model = schematic representation of a theory e.g. Broadbent’s filter model
hierarchy of research: hypothesis and data
hypothesis = narrow testable statement
data = collected observations often as part of an experiment
models: explanation without prediction
models which can predict group differences (very broad) but not individual cases
models: prediction without explanation
e.g. can predict whether an individual will get Alzheimer’s even though Alzheimer’s isn’t fully understood yet
e.g. computer models using data to give predictions - more data = more predictive power → like a "black box", as the operator doesn't understand what the computer is doing to arrive at the prediction
all statistical models fit this - they give predictions without explanation
hierarchy of formal models (6)
framework
theory
specification
implementation
hypothesis
data
same as the research hierarchy, but "model" is split into specification and implementation
formal models: specification
formal description of relations described by a theory e.g. formal model of symbolic representations
formal models: implementation
specific instantiation of a specification e.g. a computer program that can simulate and predict numerical outputs from an input (see the sketch below)
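a toy illustration of the specification/implementation split (the random-walk model and every parameter value here are assumptions for illustration only) - specification: "evidence accumulates noisily until it crosses a threshold"; implementation: a program that simulates it and yields numerical outputs:

```python
import random

def simulate_decision(drift=0.1, threshold=5.0, noise=1.0):
    """One implementation of the (illustrative) random-walk specification:
    evidence drifts upward with Gaussian noise until it crosses
    +threshold ("yes") or -threshold ("no")."""
    evidence, steps = 0.0, 0
    while abs(evidence) < threshold:
        evidence += drift + random.gauss(0, noise)
        steps += 1
    return ("yes" if evidence > 0 else "no"), steps

# numerical predictions from many simulated trials
trials = [simulate_decision() for _ in range(1000)]
yes_rate = sum(choice == "yes" for choice, _ in trials) / len(trials)
mean_steps = sum(steps for _, steps in trials) / len(trials)
print(f"P(yes) = {yes_rate:.2f}, mean decision time = {mean_steps:.1f} steps")
```

the same specification could be implemented differently (another language, another noise distribution) while still expressing the same theory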
advantages of formal models (3 - brief)
more accurate predictions
counter-intuitive predictions
explicit assumptions
advantages of formal models: more accurate predictions
- easier to reject bad models - check whether their predictions are unreasonable
- helps select which experiments to perform - if two models give the same prediction the experiment isn't useful, as we cannot tell which model was correct; with numerical predictions (e.g. % rates) we can see which came closer (see the sketch after this list)
- a subtler form of hypothesis testing - how close the model comes to the actual result - not just the general trend but closeness to a specific number/data point
- caveat: the prediction is only more precise - the model might still be bad, so the prediction can be wrong even when it is precise
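a tiny sketch of "which model got closer" (the observed value and both predictions are invented for illustration):

```python
# two competing models make numerical predictions for the same experiment
observed = 0.62                                  # invented observed recall rate
predictions = {"model_A": 0.70, "model_B": 0.58}

# score each model by the absolute error of its prediction
errors = {name: abs(p - observed) for name, p in predictions.items()}
print(errors, "→ closer model:", min(errors, key=errors.get))
```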
advantages of formal models: counter-intuitive predictions
- clear description of predictions, so when predictions seem wrong it is obvious
- with informal models it is hard to notice when counter-intuitive predictions are made
advantages of formal models: explicit assumptions
- reveal unanswered questions, flaws in reasoning, contradictory/unreasonable assumptions
- can make assumptions transparent for others to see e.g. verbal models cannot show what occurs, but formal models can simulate it
limitations of formal models [won't be tested on this]
- requires expertise
- transparency
- best compared against other computational models, not against informal
- numerical predictions can be premature
- changing the model takes time - this can limit progress
- may seem to provide scientific validity even when there is none e.g. neural network models
- making a model simulate a cognitive task does not teach us about cognition
issue understanding the brain
can only sample a small amount of activity from a small area of the brain
breakdown of brain data to understand it (Marr) (3)
- computation → problem being solved
- algorithm → steps/rules to solve problem
- implementation → actual machinery
focusing too much on any one of these 3 is bad → hyperfocus doesn't allow for interactions between the 3 - creates non-specific theories that might not match the data
bottom-up approach to brain data
neuroscience and AI
implementation → rules → problem
example:
machinery of neural circuits → generate algorithm from these → what problems do these algorithms solve
Marr (1982) view on approaches to brain data
understand algorithms better by understanding the nature of the problem being solved rather than by examining the mechanism
therefore use top-down approach
focusing on the mechanics takes away from the other parts - too much small detail without context
top-down approach to brain data
Marr’s view - and most cognitive psychologists
problem → rules → implementation
e.g.
problem to solve → what algorithms can solve this → how algorithms are implemented in neural circuits
Epstein (2008) - core reading
* misconceptions with modelling (2)
* reasons for modelling (there are 16 total, 5 written here)
goal isn’t always prediction
theories don’t arise from and summarise data - they often precede and guide data collection instead
reasons:
guide data collection, explanations, raise new questions, illuminate core dynamics, suggest analogies
Guest and Martin (2021) - core reading
* why is using models helpful
* what is an open theory
models are transparent in how conclusions are drawn → forced to analyse information closely
open theories are developed explicitly and defined formally
* can recognise flaws in replication
* open theories are better to use for their transparency and what they can tell you about the theory being tested