Chapter 3: Biases in thinking and decision-making Flashcards
System 1 and system 2 thinking
Proposed by Daniel Kahneman in 2003 as an extension to the information-processing approach: differentiation between two independent systems
System 1: intuition (fast, instinctive, automatic)
Developed as an adaptive reasoning mechanism based on prior experience that enables us to act fast and fairly accurately. Works best in predictable environments.
System 2: slower, analytical, conscious
Evolved later with the development of language and abstract reasoning.
We apply system 1 the most, typically in predictable and familiar environments, as we already know what to do.
The biases in decision-making (according to the IB book)
1) The tendency to focus on a limited amount of available information
Selective attention; schemas, asymmetric dominance, framing effect
2) The tendency to seek out info that confirms pre-existing beliefs
Confirmation bias, congruence bias, illusory correlations, and implicit personality theories.
3) The tendency to avoid mental stress of holding inconsistent cognitions
Leon Festinger & Theory of Cognitive Dissonance
Asymmetric dominance
Having to choose between two options can be hard. Adding a third, less desirable but similar decoy option manipulates us into picking the option it most resembles.
Confirmed by Huber, Payne & Puto (1982)
Huber, Payne & Puto (1982)
A: To study decisions involving an asymmetrically dominated decoy.
M:
P: Participants were required to choose between alternatives (e.g., a vacation destination) across several product categories.
Decision environments included two or three alternatives, with each alternative defined by two attributes.
R: Choice reversals were not large (but statistically significant) - 3-9% of participants switched their choice in the predicted direction when the third/decoy alternative was added.
C: Adding an asymmetrically dominated decoy makes participants more likely to select the option that dominates it.
E:
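The dominance relation behind the decoy effect can be sketched in code. This is an illustrative example with made-up two-attribute options (quality, price), not the actual stimuli from the study:

```python
# Illustrative sketch of asymmetric dominance with hypothetical options.
# Higher quality is better; lower price is better.

def dominates(x, y):
    """x dominates y if x is at least as good on both attributes
    and strictly better on at least one."""
    at_least_as_good = x[0] >= y[0] and x[1] <= y[1]
    strictly_better = x[0] > y[0] or x[1] < y[1]
    return at_least_as_good and strictly_better

target = (70, 30)  # good quality, higher price
rival = (50, 20)   # lower quality, cheaper
decoy = (65, 35)   # worse than target on both attributes, but not worse than rival on both

# The decoy is "asymmetrically dominated": dominated by the target only.
print(dominates(target, decoy), dominates(rival, decoy))  # True False
```

Because only the target dominates the decoy, the decoy's presence is predicted to shift choices toward the target.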
Framing effect
The way a choice is framed will affect the decision, even when the outcomes are logically equivalent.
Prospect theory proposed by Kahneman & Tversky (1979); the framing effect confirmed by Tversky & Kahneman (1981)
Tversky and Kahneman (1981)
A: Investigate prospect theory and framing effect.
M: Independent measures design
P: Subjects were given a problem: the USA is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs have been proposed.
Two subject groups were given the same programs, but they were formulated differently. Subjects then had to pick either program:
Group 1 (gain frame): Program A: 200 people will be saved. Program B: 1/3 probability that 600 people will be saved, 2/3 probability that no people will be saved.
Group 2 (loss frame): Program A: 400 people will die. Program B: 1/3 probability that nobody will die, 2/3 probability that 600 people will die.
R: Program A: Group 1 (72%), group 2 (22%). Program B: Group 1 (28%), group 2 (78%).
C: The two programs are identical; there is only a change in the reference point, i.e. in the formulation of the question (framing effect). People were risk-averse when outcomes were framed as gains and risk-seeking when they were framed as losses.
E:
Wason’s (1968) Four Card Problem
A: Investigate confirmation bias (the results did confirm it)
M:
P: Subjects were given four cards (A, K, 4, 7) and given the following rule: “If a card has a vowel on one side, then it has an even number on the other side.” Participants were required to only turn over the cards necessary to prove the statement true or false.
R: Subjects mostly picked “A only” or “A and 4”. The only cards that could potentially refute the rule are A and 7.
C: We only pick the choices that confirm what we believe and not those that could refute our pre-existing beliefs.
E: Confusing task? Representative of confirmation bias?
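The logic of the task can be sketched in code: a card is worth turning only if its visible face could expose a counterexample to the rule. The hidden faces are never revealed, so the check below reasons only from the visible face:

```python
# Rule: "If a card has a vowel on one side, then it has an even number
# on the other side." A card can falsify this rule only if its visible
# face is a vowel (hidden side might be odd) or an odd number (hidden
# side might be a vowel).

def can_falsify(visible):
    if visible.isalpha():
        return visible.lower() in "aeiou"  # vowel: a hidden odd number would break the rule
    return int(visible) % 2 == 1           # odd number: a hidden vowel would break the rule

cards = ["A", "K", "4", "7"]
print([c for c in cards if can_falsify(c)])  # ['A', '7']
```

Turning "4" can only confirm the rule, never refute it, which is exactly why choosing it reflects confirmation bias.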
Congruence bias
We are not only more attentive to information that can potentially support pre-existing beliefs; we also seek positive results over useful information.
Confirmed by Tschirgi (1980) - John and the cake
Tschirgi (1980) - John and the cake
A: Investigate congruence bias
M: Survey
P: John makes a cake. It tastes good, and he thinks it is because of the honey. Participants were asked what John should do to find out whether it was the honey:
A: Keep the honey and change everything else
B: Only change the honey
C: Change everything.
R: People picked A
C: Option A does not make logical sense as a test; participants wanted to guarantee another positive result rather than obtain useful information (option B, changing only the honey, is the informative test) - hence confirming congruence bias.
E: Framing of the options?
Illusory correlations and implicit personality theories
Illusory correlation: A belief that two phenomena are connected though they are not.
Implicit personality theories: our beliefs about which traits and behaviors go together in certain kinds of people.
Supported by Chapman & Chapman (1969) ink-blot test
Chapman and Chapman (1969) - Homosexuality in Rorschach cards.
A: Investigate the role of confirming pre-existing beliefs in the formation of illusory correlations and implicit personality theories.
M: Ink-blot tests
P: Used the Rorschach ink-blot test, focusing on the diagnosis of male homosexuality (specific cards were believed to be linked to it). Psychodiagnosticians were asked which cards indicated homosexuality, but the clinicians selected the wrong cards and signs: they simply picked the ones that looked the most feminine, such as seeing a woman's bra. The same was done with "naive" students, for whom the researchers also included three statements on the cards (linking homosexuality to what could be seen on the cards). The students made the same invalid choices.
R/C: Participants selected invalid correlations and did so because of their pre-existing beliefs.
E: Confirms illusory correlations and implicit personality theories. A valid study? It was conducted at a time when homosexuality was considered a mental disorder.
Cognitive dissonance - Leon Festinger
The mental stress caused by inconsistency between two or more contradictory beliefs or ideas, or between one's actions and one's beliefs, etc. One will be driven to reduce the dissonance.
Supported by Leon Festinger 1956, Freedman and Fraser (1966)
Leon Festinger 1956 - Infiltrating a UFO cult
A: investigate cognitive dissonance
M: Case study
P: Researchers covertly observed a cult whose leader had predicted the end of the world.
R: The world didn't end; rather than abandoning the belief, members claimed the world had been spared because their faith was so pure.
C: When new information conflicts with existing beliefs, people may reduce the dissonance by adding new rationalizing cognitions rather than discarding the belief.
E: Ethics? The researchers covertly infiltrated a cult.
Freedman and Fraser (1966)
A: Investigate compliance without pressure (the foot-in-the-door technique)
P: The experimental group was asked to sign a petition on the issue of safe driving; a week later they were asked to put up a "drive safe" sign on their lawn. 55% of the experimental group agreed, versus 20% of the control group. This study shows the effects of cognitive dissonance: the experimental group put up a sign that was congruent with the petition they had signed ("Why would I sign the petition and not put up the sign?").
Heuristics
Heuristics are mental shortcuts: incomplete, simplified strategies. They can lead to cognitive biases but are nonetheless useful. As more heuristics and biases have been discovered, the need for classification and a search for common causes has grown.