Cognitive heuristics Flashcards
delving into the AI archives
Social psychology borrowed the term ‘heuristic’ from AI
Computers, algorithms and the often-futile search for an optimal solution
Settle for approximations
We don’t always try to find perfect solutions
in general
“…the rules people use are fairly rational ones…But the rules are only useful if uncertainty exists or if too much effort is required to arrive at a more complete and accurate judgment. When people rely on heuristics in cases where they could well engage in more accurate types of analysis, and when uncertainty is reduced by the presence of useful data, then heuristics are a source of bias.” (Moskowitz, 2005)
what should we do?
Assess the available alternatives for the likelihood and worth of the outcomes they promise (probability and value)
Calculate the utility of each outcome (the product of its value and its probability)
Choose the option that maximises utility
In plain English: we make the decision that is most likely to deliver the benefits we desire (see the sketch below)
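A minimal sketch of that recipe in Python; the options, probabilities and values are invented purely for illustration:

```python
# Hypothetical sketch of the normative "maximise expected utility" recipe.
# Each option lists its possible outcomes as (probability, value) pairs.
# The options, probabilities and values are invented purely for illustration.
options = {
    "take the new job": [(0.6, 80), (0.4, -20)],
    "stay put":         [(0.9, 30), (0.1, -5)],
}

def expected_utility(outcomes):
    # Utility of an option = sum over its outcomes of probability * value.
    return sum(p * v for p, v in outcomes)

for name, outcomes in options.items():
    print(f"{name}: expected utility = {expected_utility(outcomes):.1f}")

# The 'rational' choice is the option with the highest expected utility.
best = max(options, key=lambda name: expected_utility(options[name]))
print("choose:", best)
```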
we know this doesn’t always happen
We are not computers!
There may be too much information to sift through rationally
We rarely have the time
We can’t be sure of the outcome (we might still not be happy) – most life decisions do not come with a crystal ball or a guarantee
satisficers v optimisers
Most of the time we are ‘satisficers’, making adequate inferences and decisions, rather than ‘optimisers’, drawing the best possible inferences and hence reaching the best possible decisions (March & Simon, 1958)
Kahneman and Tversky
Highlighted some of the ways we satisfice by relying on heuristics (drawing on economic theory)
Shone a light on something we all do (examples)
system 1 and 2 thinking
System 1 allows you to: orient to the source of a sudden sound; complete the phrase “bread and…”; answer 2 + 2 (automatic)
System 2 allows you to: brace for the starter gun in a race; look for a woman with white hair; fill out a tax form (requires attention and effort)
Not different from the dual process models previously mentioned
To do with resource availability
Good at spotting when System 2 thinking needs to happen
what happens when you pay attention to one thing?
You might not pay attention to something else
Inattentional blindness:
We can be blind to the obvious
We can be blind to our own blindness
applying the systems approach to our use of cognitive heuristics
see notes for long example
sample size
Sample size matters, but we often fail to take account of it
Statistics produce many observations that seem to beg for causal explanations but are simply due to chance
System 1 is the mode of thinking that leaps to causal connections
It runs ahead of the facts and jumps to (often wrong) conclusions
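A quick simulation sketch of the point about chance (all parameters arbitrary): with a genuinely 50/50 process, small samples routinely produce lopsided results that look like they need a causal explanation, while larger samples almost never do.

```python
# Simulation sketch: lopsided results arise by chance far more often in small
# samples than in large ones. The sample sizes and 70% cut-off are arbitrary.
import random

random.seed(1)

def share_of_lopsided_samples(sample_size, n_samples=10_000, cutoff=0.7):
    # Fraction of samples from a fair 50/50 process in which at least
    # `cutoff` of the observations fall on the same side.
    lopsided = 0
    for _ in range(n_samples):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= cutoff or share <= 1 - cutoff:
            lopsided += 1
    return lopsided / n_samples

for n in (10, 50, 500):
    print(f"sample size {n:>3}: {share_of_lopsided_samples(n):.1%} of samples look lopsided")
```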
what other heuristics highlight system 1 thinking?
Representativeness
Availability
Adjustment and anchoring
representativeness example
see notes
What happened to taking the base rate information into account?
What about the fact that you were told the description was based on results of ‘uncertain validity’?
Would bearing these things in mind cause you to revise your characterisation of Tom as a nerd?
representativeness heuristic
a mental shortcut whereby instances are assigned to categories on the basis of how similar they are to the category in general
representativeness empirical illustration
Tversky and Kahneman (1974)
Estimate probability a man was either engineer or lawyer
Man sampled at random from group of engineers and lawyers
Some told: 70% engineers, 30% lawyers
Others: 30% engineers, 70% lawyers
Some got (useless) personality profile
Some didn’t
Tversky and Kahneman (1974) results
When people didn’t get a personality profile, their estimates reflected the base rates (if told 70% were lawyers, they estimated a 70% chance he was a lawyer)
When they did get a profile (even though it was rubbish!), they became less rational: they ignored the base rate information and essentially guessed (50/50 judgments)
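For contrast, a sketch of the base-rate-respecting answer via Bayes’ rule (the likelihood figures are assumptions for illustration): if the profile is truly uninformative, the posterior simply equals the base rate, so answering 50/50 amounts to throwing the base rate away.

```python
# Sketch of the Bayesian benchmark for the engineer/lawyer judgment.
# The 30% base rate matches one of the study's conditions; the likelihoods
# of the personality profile are assumed values chosen for illustration.

def p_engineer_given_profile(base_rate_engineer, p_profile_if_engineer, p_profile_if_lawyer):
    # Bayes' rule: P(engineer | profile)
    prior_e = base_rate_engineer
    prior_l = 1 - base_rate_engineer
    evidence = p_profile_if_engineer * prior_e + p_profile_if_lawyer * prior_l
    return p_profile_if_engineer * prior_e / evidence

# Uninformative profile (equally likely for engineers and lawyers):
# the posterior is just the 30% base rate, not 50/50.
print(p_engineer_given_profile(0.30, 0.5, 0.5))   # 0.30

# A profile twice as likely to describe an engineer (assumed) nudges the
# posterior up, but the base rate still pulls it well below certainty.
print(p_engineer_given_profile(0.30, 0.6, 0.3))   # ~0.46
```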