Guide to Kahneman Vocabulary Flashcards
Affect heuristic
Our immediate, involuntary emotional response influences our decisions. The affect heuristic makes us see predominantly benefits (good things) when we have a positive emotional response to a choice, and predominantly risks (bad things) when we have a negative emotional response to it. For example, displaying photos of dead eagles near a wind farm evokes a negative emotional response towards wind farms and makes people less receptive to them, even though such deaths are rare and eagles learn to avoid the turbines over time.
Anchoring effect
The tendency to rely on an initial ‘anchor’ when making subsequent numerical judgements or estimates. The anchor may even be completely unrelated to the decision problem. There are two explanations for this effect. First, it may result from priming (the work of System 1): our decision is influenced, unconsciously, by an initial stimulus. Second, insufficient adjustment (the work of System 2): we start from the anchor but fail to adjust far enough away from it.
Availability heuristic
The tendency to judge things we remember easily as larger or more widespread than they really are. The problem is that we rely on examples that come immediately to mind when making decisions, which may not be the most relevant or common cases; this undermines our ability to judge the frequency and magnitude of events. For example, people recently affected by a flood may choose premium flood insurance because floods are readily available in memory, making them seem more likely than they are.
Confirmation bias
“Confirmation bias describes each person’s underlying tendency to notice, focus on, and provide greater credence to evidence that seems to confirm his or her existing beliefs.” This matters because the information we notice and attend to is precisely the information that confirms what we already believe, so it reinforces our beliefs, while we tend to fail to notice arguments or information that disconfirm them. For example, if we favour nuclear power stations, we will tend to notice the arguments supporting them.
Framing
The way in which a problem is presented influences how we understand it and thereby influences the outcome: for example, “90% fat free” vs. “10% fat”, or “2% death rate” vs. “98% survival rate”.
Heuristic
An unconscious or conscious mental shortcut that makes us consider only a fraction of the available information in our decisions. Heuristics help us make quick decisions; sometimes they work well, and sometimes they are imperfect (biased).
Overconfidence
This is part of a family of biases related to optimism. It concerns our view of ourselves: we think we are more knowledgeable and capable than we actually are.
Planning fallacy
“The tendency to underestimate task-completion times and costs, even [when] knowing that the vast majority of similar tasks have run late or gone over budget” (Flyvbjerg et al., 2009, p. 174). Estimates are usually closer to best-case scenarios and could benefit from consulting statistics on similar cases.
Prospect theory
A theory developed by Kahneman and Tversky in 1979 that challenges classic expected utility theory. It describes how individuals weigh losses more heavily than equivalent gains. It rests on three elements: a reference point, diminishing sensitivity (analogous to diminishing marginal utility in microeconomics), and loss aversion.
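As an illustrative sketch (not part of the flashcard itself), the three elements are often captured in the prospect-theory value function over gains and losses measured relative to the reference point:

```latex
% v(x): subjective value of an outcome x relative to the reference point (x = 0).
% alpha, beta in (0, 1] capture diminishing sensitivity (the curve flattens
% as gains or losses grow); lambda > 1 captures loss aversion (losses loom
% larger than equivalent gains).
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\[4pt]
-\lambda \, (-x)^{\beta} & \text{if } x < 0
\end{cases}
```

With the commonly cited parameter estimates from Tversky and Kahneman's later cumulative prospect theory (roughly α ≈ β ≈ 0.88 and λ ≈ 2.25), a loss feels a bit more than twice as bad as a gain of the same size feels good.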