Flashcards

1
Q

what is a decision

A

‘a commitment to a course of action that is intended to produce a satisfying state of affairs’ (Yates, Veinott & Patalano, 2003).
It:
Involves a choice between actions (which may include inaction)
Is goal-oriented
Usually involves evaluation of outcomes
But, there are different types of decisions

2
Q

strategic decisions

A

Strategic decisions: involve the general direction taken by an individual or organization, take some time to make, and involve outcomes that occur in the long term.

3
Q

tactical decisions

A

Tactical decisions: involve the implementation of strategy through a stream of smaller-scale decisions; they take a shorter time to make and their outcomes occur in the shorter term.

4
Q

operational decisions

A

Operational decisions: the day-to-day decision making needed to execute plans and tactics.

5
Q

unstructured decisions

A

Unstructured decisions: there is a general understanding of the need to act, but no clear idea about the goals/objectives, the alternative actions that are available, or how they should be evaluated.

6
Q

structured decisions

A

Structured decisions: clear objectives, a clearly defined choice set and an understanding of how the alternatives are to be evaluated.

7
Q

risky decisions

A

The decision maker does not know which of several possible outcomes will occur (e.g. the return on an investment depends on whether there is an upturn or downturn in the economy)
• Involves risk and uncertainty

8
Q

riskless decisions

A
  • Outcomes known (know what we get if we choose something). Problem is knowing which is the best value option
  • Need to trade-off cost and value
9
Q

normative theories

A

Normative theories prescribe what people should do if they wish to be rational decision makers.
These theories hold that, to be rational, people must maximise utility in the way prescribed in economics.
The theories are formal, based on sets of axioms or assumptions.
The assumptions are deceptively simple and reasonable, and an underlying mathematical proof shows that, for any person accepting the assumptions (and most people do when they are presented with them), it is possible to identify the rational or best course of action for that individual in that situation.
This normally involves maximising the best interests of the decision makers or maximising expected utility.

10
Q

descriptive theories

A

Descriptive theories are concerned with explaining what people actually do and how they do it, rather than what they ought to do.
Descriptive theories may be developed in terms of mathematical models describing how the decision information is combined to determine choice or they may be developed in behavioural terms identifying the psychological/thinking processes involved in determining choice.

11
Q

prescriptive theories

A

Prescriptive theories take account of what we know about the difficulties people have in making decisions and use this to prescribe procedures that people should follow if they wish to make a good decision.
In the early days this involved procedures that ensured people were consistent with the normative theory (i.e. maximised their utility). In recent years the focus has shifted towards helping people move from System 1 to System 2 thinking.

12
Q

decision theory - as an example of normative theory

A

von Neumann & Morgenstern (1947) developed a way of formally quantifying decisions
• based on a number of axioms (rules)
• covers both riskless and risky decision situations (we will focus on risky decisions)

13
Q

example of decision theory

A
The decision problem is set out as a matrix, with A (the alternatives) on the y-axis and S (the states of the world) on the x-axis:
A1...An = choice alternatives
S1...Sn = possible states of the world
O11...Onn = all possible outcomes (one for each alternative–state combination)

The value (utility) of each outcome to the decision maker, which is subjective, is placed in the corresponding outcome cell.

Probabilities are assigned to the states of the world (based on intelligence).

To determine the best alternative, calculate the subjective expected utility of each alternative: the sum, across the states, of (value of the outcome × probability of the state).
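A minimal Python sketch of the subjective expected utility calculation described above; the alternatives, states, utilities and probabilities are made-up illustrative numbers, not anything from the lecture:

# Subjective expected utility (SEU): for each alternative, sum over states of
# P(state) * utility(outcome in that state). All numbers below are illustrative.
utilities = {                       # u(O_ij): subjective value of each outcome
    "invest":    {"upturn": 80, "downturn": -30},
    "keep_cash": {"upturn": 10, "downturn": 10},
}
probabilities = {"upturn": 0.6, "downturn": 0.4}   # P(S_j), based on 'intelligence'

def seu(alternative):
    return sum(probabilities[s] * u for s, u in utilities[alternative].items())

best = max(utilities, key=seu)
print({a: seu(a) for a in utilities}, "-> choose", best)
# invest: 0.6*80 + 0.4*(-30) = 36; keep_cash: 10 -> choose invest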

14
Q

Axiomatic theory

A

normative theory is an axiomatic theory
example axioms:
- decidability; one must be able to decide between options
- transitivity; if a > b & b > c then a > c
- invariance; underlying structure, not surface structure. in other words the same question asked in different ways should yield the same answer.
- independence of utility and probability; how important an event is to you should not influence your judgement of how likely it is.

15
Q

conclusions from normative decision theory

A

provides the rational solution for a decision
• offers terms in which decisions can be conceptualised
• shows how the best course of action can be derived in a System 2 manner
• produces highly influential models

But:
• Does it describe how we make decisions?
• If people do not decide like this, are they irrational?

16
Q

system 1 thinking attributes

A
fast
unconscious
automatic
everyday decisions
error prone
17
Q

system 2 thinking attributes

A
slow 
conscious 
effortful
complex decisions
reliable
18
Q

What is limited capacity processing and how does it impact decision making

A

Capacity is limited in terms of amount and speed of processing
Information decays over time unless we rehearse it
But rehearsing also uses capacity and doing this while carrying out other computations is difficult
If we cannot register and rehearse two five-digit numbers what about decision making!?
These limitations and how we cope with them crucially affect how we think and the accuracy of this thinking in decision situations

19
Q

George Miller on limited capacity processing

A

George Miller
Short term memory has three key aspects:
1. limited capacity (only about 7 items can be stored at a time)
2. limited duration (storage is very fragile and information can be lost with distraction or passage of time)
3. encoding (primarily acoustic, even translating visual information into sounds).
There are two ways in which capacity is tested, one being span, the other being recency effect.
The magic number 7 (plus or minus 2) provides evidence for the capacity of short-term memory. Most adults can store between 5 and 9 items in their short-term memory. This idea was put forward by Miller (1956), who called it the magic number 7. He thought that short-term memory could hold 7 (plus or minus 2) items because it only had a certain number of “slots” in which items could be stored.
However, Miller didn’t specify the amount of information that can be held in each slot. Indeed, if we can “chunk” information together we can store a lot more information in our short term memory.

20
Q

Herbert Simon on limited capacity processing

A

Herbert Simon:
Economic models make unrealistic assumptions
They assume unbounded rationality
Simon instead proposed 'bounded rationality': people are rational only within the limits of their cognitive capacity, the information available and the time they have.

21
Q

How do people cope with cognitive limitations (limited capacity processing)

A
• use system 2 thinking, engaging in systematic, analytical, conscious, rule-based thought.
  • use system 1 thinking (simplify), simplify so you can think within capacity limitations
  • use system 1 thinking (match to past experience)
22
Q

Why use system 2 thinking?

A

People can be motivated to engage in System 2 if something is important enough to expend energy on
Having sufficient time is another reason – people often revert to system 1 when under pressure.
It is also important that decision makers have appropriate information available.

23
Q

Why use/not use system 1 thinking (simplify)?

A

When we satisfice we don’t get the best, but we do get something adequate.
E.g. “satisficing”: choose the first reasonable option (Simon, 1956).
There is always a trade-off between the value of the decision and the time it would take. Problems arise if you don’t see this explicitly, particularly if you try to weigh all options, fail, and then pick something inappropriate when confused by the situation or in a panic due to time pressure.

Satisficing is simpler and leads to a resolution in the time allowed.

We can use even simpler strategies, e.g. the recognition rule: you might go for brands you know or recognise when you buy something new like a phone or a TV. Gigerenzer also puts forward the simple rule “if you recognise one object but not the other, assume the first has a higher value” (see the sketch below).
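A minimal Python sketch of satisficing as described above; the options, ratings and aspiration level are hypothetical illustrations rather than anything from the lecture:

# Satisficing (Simon, 1956): take the first option that clears an aspiration
# level, instead of evaluating every option. All names/numbers are illustrative.
def satisfice(options, good_enough):
    for option in options:
        if good_enough(option):
            return option          # stop at the first adequate option
    return None                    # nothing met the aspiration level

phones = [("Phone A", 3.9), ("Phone B", 4.4), ("Phone C", 4.8)]
choice = satisfice(phones, lambda phone: phone[1] >= 4.0)
print(choice)   # ('Phone B', 4.4): adequate, though not the best available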

24
Q

why use/not use system 1 thinking (match to past experience)

A

Try to recognise current situation from past experience and do what worked before
People store past instances AND actions for dealing with them
BUT for this to work you need expertise. You recognize things you have learned or experienced. You aren’t trying to think of what could be wrong – you know.
Many situations you face as a manager don’t really have a good match to many previous examples.

25
Q

problems with system 1 (simplify)

A

People adopt System 1 to deal with situations without recognising its impact on the quality of decision making
While System 1 thinking is generally quite efficient, it has the potential for catastrophe in some situations (i.e. where the environment is not forgiving of error)
It’s difficult to defend choice because the decision maker has only fleeting knowledge of what they did.
A basic problem is knowing whether a previous decision was a good one (as in the previous example, with the credit managers).

26
Q

problems with system 1 (matching using experience)

A

Matching is done against the past, not the present or future. Sudden changes, such as an increased threat of terrorism, are not reflected in past experience, so the usual action takes no account of them.
People have evolved for pattern recognition, and so tend to force matches and minimise differences between situations.
Matching after only limited experience leads to inappropriate categorisation and response links.

27
Q

problems with system 2 thinking

A

Effortful
Time consuming
Detracts from other tasks
May not have the skills / know the process to follow

28
Q

interaction between system 1 and 2

A

Kahneman and Frederick (2002)
The two systems operate in parallel
S2 monitors the output of S1 to determine whether it is acceptable or whether S2 needs to step in to modify, correct or change it.
System 2 thinking needs attention and self control – these need energy and there are real world implications if this is depleted.
Bat-and-ball problem (Frederick, 2005): people who consider the intuitive response first but are able to overcome it appear more aware of the difficulty of the problem (see the worked example below).
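For reference, the standard bat-and-ball item (well-known wording, not quoted from this deck): a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball – how much does the ball cost? The intuitive System 1 answer is 10 cents, but then the bat would cost $1.10 and the total $1.20. Solving ball + (ball + 1.00) = 1.10 gives ball = 0.05, so the correct answer is 5 cents.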

29
Q

cognitive reflection and rational thinking

A

Cognitive Reflection Test: Three-item test developed by Shane Frederick (2005).
When people respond incorrectly, they often provide the intuitive response.
The solution is generally understood when explained

30
Q

challenges to the dual system approach

A

Evans and Stanovich (2013) review and address several criticisms

  • They suggest using the terms Type 1 vs. Type 2 processing, and focusing on the defining features of each, since ‘system’ implies distinct parts of the brain and some feel it is too strong a term.
  • According to them, the defining features of Type 1 processing are that it does not require working memory and is autonomous, while the defining features of Type 2 processing are that it requires working memory and involves cognitive decoupling (i.e., the ability to run thought experiments or engage in hypothetical thinking).

An influential critic is Gerd Gigerenzer (2011). He says: “Dual-process theories of reasoning exemplify the backward development from precise theories to surrogates”

31
Q

when choosing, how do we gather and interpret information of 2 kinds

A

The likelihoods that particular outcomes have happened, are happening or will happen.
We often have incomplete information about the past/present and always about the future,
so we need to make a judgement, i.e. assess likelihoods/risk (the probability element of decision theory).
The attractiveness of outcomes (i.e. their subjective value or utility) to me or my organisation (this relates to the utility part of decision theory).

People usually gather and interpret this information using System 1 thinking.
This can lead to biased interpretations and conclusions.

32
Q

natural frequency approach

A

Example: screening 100 children for abuse.

Out of 100 children, 3 are abused and 97 are not abused.
Of the 3 abused children, the test is positive for about 3 (95% hit rate) and negative for about 0.
Of the 97 non-abused children, the test is positive for about 10 (10% false positive rate) and negative for about 87.

So, given a positive test, the actual likelihood of abuse = 3/(3+10) ≈ 23%.
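The same result via Bayes’ theorem, as a minimal Python sketch; the 3% base rate, 95% hit rate and 10% false-positive rate are the figures from the example above:

# P(abused | positive test) via Bayes' theorem, using the natural-frequency
# figures above: base rate 3%, hit rate 95%, false-positive rate 10%.
base_rate = 0.03          # P(abused)
hit_rate = 0.95           # P(positive | abused)
false_positive = 0.10     # P(positive | not abused)

p_positive = base_rate * hit_rate + (1 - base_rate) * false_positive
p_abused_given_positive = base_rate * hit_rate / p_positive
print(round(p_abused_given_positive, 2))   # ~0.23, i.e. about 23%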

33
Q

screening problems system 1 or 2 thinking?

A

The System 2 rule here is Bayes’ theorem, but it is usually not used
Instead people often use System 1 thinking:
Either focus on the ‘evidence’ and some intuitive re-adjustments
OR just focus on the ‘initial likelihood’ and use that.
People often can’t engage System 2 because they have not been taught it
Evidence that experts, including expert witnesses, fall foul of this bias – usually due to ‘base rate neglect’.
They do not follow System 2 thinking, misperceive the risk and end up taking a bad decision because of it

34
Q

Examples of Bayesian inferences in expert judgement

A

OJ Simpson’s trial for the murder of his ex-wife (found not guilty)

  • The defence team argued that battery is not a good predictor of murder: they claimed the odds were only 1 in 2,500 that a batterer murders his wife. The evidence for this was that 2.5–4 million women are battered annually by their husbands, while there are only about 1,500 such homicides.
  • This was not the relevant statistic, as we are not predicting whether a battered woman will go on to be murdered.
  • We already know she was battered and that she was murdered, so the relevant question is the probability that she was murdered by her batterer.
  • Natural frequencies (applying Bayes’ theorem to the underlying base rates) show roughly an 88.9% chance she was murdered by OJ (see the sketch below).
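A hedged reconstruction of that natural-frequency calculation in Python. The 1-in-2,500 figure is from the card; the assumption that about 5 in 100,000 battered women are murdered by someone other than their partner is an illustrative figure chosen to reproduce the quoted 88.9%:

# Natural-frequency sketch (illustrative figures, not authoritative data).
battered_women = 100_000
murdered_by_partner = battered_women / 2_500      # ~40 per year (1 in 2,500)
murdered_by_other = 5                             # assumed figure for illustration

p_partner_given_murdered = murdered_by_partner / (murdered_by_partner + murdered_by_other)
print(round(p_partner_given_murdered, 3))         # ~0.889, i.e. about 88.9%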
35
Q

implications of assessing likelihood and probabilities

A

People are bad at assessing risk and probability, and this can lead to poor judgements and inappropriate decisions.
People need to be taught how to reason using System 2 forms of thinking, but rarely are.

But do people realise that their judgments are that bad?

36
Q

Calibration of confidence judgements

A

When people are asked a number of multiple-choice questions and also asked to give their confidence level for each answer, the averages can be used to create a calibration graph:

the proportion of questions answered correctly goes on the y-axis
the subjective assessment of confidence level goes on the x-axis
The difference between the perfect-calibration line and the subjective estimate line is called the overconfidence gap.
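A minimal Python sketch of how such a calibration curve and the overconfidence gap could be computed; the answer data are made up purely for illustration:

# Calibration: group answers by stated confidence and compare stated confidence
# with the proportion actually correct. Data below are illustrative.
from collections import defaultdict

answers = [  # (stated confidence, answered correctly?)
    (0.6, True), (0.6, False), (0.8, True), (0.8, False),
    (0.8, False), (1.0, True), (1.0, False), (1.0, True),
]

by_confidence = defaultdict(list)
for confidence, correct in answers:
    by_confidence[confidence].append(correct)

for confidence in sorted(by_confidence):
    results = by_confidence[confidence]
    accuracy = sum(results) / len(results)
    # overconfidence gap: stated confidence minus actual accuracy
    print(f"confidence {confidence:.0%}: accuracy {accuracy:.0%}, gap {confidence - accuracy:+.0%}")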

37
Q

Impact of confidence judgements

A

“It’s not what we don’t know that gives us trouble, it’s what we know that ain’t so” (Will Rogers).
Bazerman (2014) suggests overconfidence contributes to a large number of failed mergers and acquisitions
Related is the optimism bias – the tendency to be overly optimistic about good things happening and not pessimistic enough about potential bad events!
People (managers) are generally optimistic and overconfident.
This is known to have been a major cause of individual and organisational failure.
We need to understand why and then address.
Next time we identify the forms of System 1 thinking (heuristics) that people often use in these kinds of situations that can lead to biased judgements

38
Q

confirmation thinking

A

Thinking that involves looking for confirming evidence (and not disconfirming evidence) when testing ideas about what has, is or is going to happen

It is a classic form of System 1 thinking
Major cause of individual and organisational failure
Both experts and novices use it

Example: personalised social media newsfeeds mean that people tend only to see news that relates to their interests, which is likely to reinforce their existing views – what Eli Pariser (2011) calls the “filter bubble”.

39
Q

Overcoming confirmation thinking

A

As a “natural” form of thinking, it is hard to break
Koriat et al (1980) found:
forcing people to ask disconfirming questions makes judgements more accurate
To encourage System 2 thinking in this context:
Demonstrate normal ways could be flawed
Get people to reflect on how to disconfirm or “What would it take for me (us) to be wrong?”
Fault trees (require identification of all the reasons a plan / idea may not work)
Structure in disconfirming protocols, e.g. regular reviews / devil’s advocate/ dialectic enquiry

40
Q

What is a heuristic

A
A heuristic is a simplifying strategy, rule of thumb or mental shortcut that people rely on when making judgments and decisions. Once again, demonstrating classic System 1 thinking.
Three main examples are:
Representativeness
Availability
Anchoring and adjustment
41
Q

The representativeness heuristic

A

Definition: Instead of considering the true likelihood of an event, people use degree of match / how well one thing resembles another, as a proxy for determining probability.
Use degree of match because it is something that we do all the time in other contexts so a ‘natural’ form of thinking.

e.g. given two sequences of heads and tails, choose which one was randomly generated:

H T H H T H T T H H T
H H H H T T T T H H H

Most people choose the first sequence because it is more representative of (matches our internal model of) random sequences, even though both specific sequences are equally likely.

42
Q

implications of representativeness heuristic

A

Try to teach people to be aware of representativeness and its associated biases.

Need to encourage System 2 thinking - judgements should be based on data rather than ‘experience’ or “feeling”.

Need to think carefully about what data is available, what we need, how to make sense of it and ways of presenting it so people will understand it properly.

43
Q

Availability heuristic

A

Definition: The availability heuristic involves evaluating the likelihood of events in terms of the ease with which past occurrences of those events happening can be brought to mind rather than determining true probability.
The System 2 approach would require knowing the relevant statistics and applying them to the particular situation.

Most people don’t know the relevant statistics but still make a judgement by replacing the hard question with an easier one that is more System 1 in nature (ease of recall of examples)
Builds on natural form of thinking – storing relative frequencies of events

Some face validity, but we overestimate high availability risks and ignore the ‘facts’, e.g. fear of crime

Caused by a combination of availability plus media attention

Spills over into policy of organisations and governments

44
Q

Who uses availability heuristic

A

Used by experts as well as novices
Senior IT security managers and prediction of threats:
Overestimate ‘available’ threats – hackers
Underestimate “unavailable” threats – e.g. downtime because systems are under-powered and unable to deal with peaks of demand
Expenditure on security not commensurate with the risks involved.

45
Q

Implications of availability heuristic

A

Need to encourage System 2 thinking - judgements should be based on data rather than ‘experience’.

Need to think carefully about what data is available, what we need, how to make sense of it and ways of presenting it so people will understand it properly.

46
Q

Anchoring heuristic

A

Definition: Heuristic that involves making estimates based upon an initial value (sometimes relevant, e.g. last year’s sales figures when predicting this year’s figures, sometimes not) and then adjusting from that value, often insufficiently.

47
Q

examples of anchoring heuristic

A

Computationally simpler than other example heuristics, but open to error. Examples:
Ariely et al (2003) – example of anchoring on an irrelevant number (e.g. an NI number). Those with numbers ending 80–99 made offers for goods around three times as high as those with numbers ending 00–19.
Wansink et al (1998) – supermarket experiment with a soup special offer and a sign limiting purchases to no more than 12 cans. Without the sign almost half the customers bought one or two cans; with the sign most bought between 4 and 10 cans!

48
Q

challenges to the heuristics and biases literature

A

Methodological
Continuity argument
Appropriateness of the normative solution
Little predictive ability to determine which heuristic is chosen in a particular situation
The structure argument
The citation bias
Difference between frequency judgements and single-case judgements

Gathering and interpreting evidence about risk and uncertainty is subject to predictable errors and biases (in part because we rely on System 1 Heuristic thinking)
Even experienced and knowledgeable decision makers fall into these traps, leading to major individual and organisational failures.

49
Q

ecological rationality (Gigerenzer et al.)

A

This work suggests that, under particular circumstances, simpler and less effortful processes can lead to (more) accurate responses.
Emphasizes the relevance of taking into account both the cognitive limitations of the decision maker, and the structure of the environment.
People use a number of simple ‘fast-and-frugal’ heuristics that are effective because they exploit idiosyncrasies of our environment
Mind as an ‘adaptive toolbox’ of specialized cognitive heuristics

50
Q

what is a decision frame

A

Internal representation of a problem inside the head of the decision maker. The act of building it is called framing.

51
Q

What is prospect theory

A

A (descriptive) behavioural economic theory that describes the way people choose between risky alternatives where the probabilities of outcomes are known (developed by Kahneman & Tversky; 1979, 1992)

52
Q

problems of modelling decision problems

A

People build models of problems that they face
They accept models uncritically
Models tend to be:
Simple
Often different from reality
Often only one of several equally plausible options

53
Q

What is loss aversion

A

People’s tendency to strongly prefer avoiding losses to acquiring equivalent gains. Most studies suggest that losses are twice as powerful, psychologically, as gains.

54
Q

what is a negative/positive frame

A

modelled as losses = negative frame

modelled as gains = positive frame

55
Q

The framing effect

A

When choosing between 2 equivalent options where one option is worded in a negative frame and the other positive, most people will choose the positive frame option.

e.g.
(A) If Program A is adopted, 200 people will be saved.
(B) If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.

Both programs have the same expected outcome (one-third of 600 = 200 saved); here both are described in the positive (gains) frame. In the negative frame the same options are described in terms of deaths (400 people will die for certain vs. a one-third probability that nobody dies and a two-thirds probability that 600 die).
Kahneman and Tversky’s (1979) experiment showed a significant preference reversal depending on how the question was framed:

people are risk seeking in losses (negative frame)
people are risk averse in gains (positive frame)

This violates the invariance axiom underpinning normative theory.

56
Q

Real life example of framing effect

A

Hodgkinson et al (1999): strategic case study on a bank with a strategic choice at the end (same structure as the Asian disease problem, but more detailed and realistic)
Branch based strategy (safe option)
Internet banking (risky option)
Choice framed either positively or negatively
Experienced bankers (considerable knowledge and experience) were risk averse in gains and risk seeking in losses
Decisions indicative of S1 not S2

57
Q

prospect theory Kahneman and Tversky (1979)

A

Like Decision Theory (DT), PT considers both subjective utilities and probabilities, but these are subject to cognitive distortions (System 1 thinking) so derived differently to DT. Specifically:
The value of outcomes (prospects) is evaluated relative to a reference point (often the status quo)
Probabilities are assigned a weight, but there is not a linear relationship between the actual probability and the weight given to it

58
Q

the first stage of prospect theory

A

Editing:
Sometimes the decision model we generate, and the way we frame it, is wrong or just one of several possible ones (remember the visual illusions)
Many mental operations occur during this stage. The most important is coding
Coding is concerned with how we model, and therefore value, outcomes
This judgment is based on relative rather than absolute judgements: i.e. relative to a reference point

59
Q

the second stage of prospect theory

A

Evaluation
Same as Decision Theory but with cognitive distortions
In PT utility judged as gains and losses from a reference point not as final states of wealth
So we value different amounts of gain and loss using the value function (VF)
The VF is plotted with value on the y-axis and gains/losses on the x-axis:
convex for losses (the curve bends downwards to the left of the reference point)
concave for gains (the curve flattens to the right of the reference point), showing decreasing marginal utility

60
Q

how does the value function explain decreasing marginal utility

A

The concave-shaped curve means that equal increases in gains do not result in equal increases in value (e.g. the first £100 gained adds more value than an additional £100 on top of £1,000).
Bernoulli: the utility of an amount is dependent on the circumstances of the person making the estimate

61
Q

how does value function explain framing effects

A

The shape of the value function graph is critical in determining risk attitudes:
in the positive frame, the sure thing holds more value than the gamble and so is preferred;
in the negative frame, the gamble holds more (i.e. less negative) value than the sure thing and so is preferred (see the numerical sketch below).
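A minimal Python sketch of a prospect-theory value function applied to the Asian disease numbers. The functional form and parameters (exponent 0.88, loss-aversion coefficient 2.25) are the commonly cited Tversky & Kahneman (1992) estimates, used here purely for illustration, and probability weighting is ignored for simplicity:

# Prospect-theory value function: v(x) = x**0.88 for gains, -2.25*(-x)**0.88 for
# losses (Tversky & Kahneman, 1992 estimates; illustrative only).
def value(x, alpha=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Positive frame: save 200 for sure vs. 1/3 chance of saving 600 (weighting ignored)
sure_gain = value(200)
gamble_gain = (1 / 3) * value(600)
print(sure_gain > gamble_gain)    # True: sure thing preferred (risk averse in gains)

# Negative frame: 400 die for sure vs. 2/3 chance that 600 die
sure_loss = value(-400)
gamble_loss = (2 / 3) * value(-600)
print(gamble_loss > sure_loss)    # True: gamble preferred (risk seeking in losses)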

62
Q

What impact does the value function have on loss aversion

A

losses are steeper than gains by about two times
losses loom larger than equivalent gain, explaining loss aversion

example of loss aversion
- in the USA there was a move to allow retailers to charge extra for users of credit cards compared with cash. Banks insisted this be called a cash discount (framed as a gain) rather than a credit surcharge (framed as a loss)

63
Q

endowment effect

A

The tendency to value something more once we own it. Leads to discrepancies between buying and selling prices, willingness to pay and willingness to accept.
Once people own something, value it more since giving it up is conceptualised as a loss (more impactful than equivalent gain)
Kahneman, Knetsch & Thaler (1990). Three groups:
‘sellers’ were given a mug (so owned it) and indicated their minimum selling price
‘buyers’ were given a sum of money they could keep or use to buy the mug, and indicated their maximum buying price
‘choosers’ could choose between a mug and a sum of money, and indicated the amount of money they valued as highly as the mug
Sellers demanded roughly twice as much for the mug as buyers and choosers were willing to pay – the endowment effect.

64
Q

status quo bias

A

The tendency to remain in the same state (status quo or default) rather than changing and moving to another state, i.e., the tendency to leave things as they are.

US motor insurance example (1992): states changed their motor insurance laws so that consumers could accept a reduced right to sue for pain & suffering in exchange for lower insurance rates
In New Jersey where the default was reduced right to sue – only 20% purchased the full right
Compared to Pennsylvania where the default was full right to sue: 75% kept the full right. This is the status quo bias – caused because giving up the full right was conceptualised as a loss

65
Q

difference between prospect theory and decision theory

A

In PT, probabilities have an impact on decisions that differs from their numerical value.
This is another cognitive distortion.

66
Q

probability weighting function

A

probability weight is a measure of the impact that probabilities have on decisions

The weighting function is plotted with probability weight on the y-axis and probability on the x-axis.

People overweight small-probability events (low probabilities sit above the proportional line)
and underweight high-probability events (high probabilities sit below the proportional line).

The cognitive distortion shows up in the fact that equal increases in probability do not produce equal increases in weight (see the sketch below).
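A minimal Python sketch of one common probability weighting function, w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ); the form and γ ≈ 0.61 are the Tversky & Kahneman (1992) estimates for gains, used here only to illustrate over- and underweighting:

# Probability weighting function (Tversky & Kahneman, 1992 form; illustrative).
def weight(p, gamma=0.61):
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> w(p) = {weight(p):.2f}")
# Small probabilities are overweighted (w(p) > p); medium-to-high probabilities
# are underweighted (w(p) < p).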

67
Q

probability weighting weighting function tendencies

A

People tend to:
Underweight medium - high probability events
Overweight small probabilities (eg winning the lottery)
Very heavily weight certainty

Slovic et al. (1982): imagine an insurance policy that provides full protection against fire but not flood. People find the insurance more attractive when it is described as “full protection against fire” rather than “an overall reduction in the probability of loss” (the former gives certainty in one area, despite saying nothing about all other risks).

68
Q

the certainty effect

A

The tendency to weigh shifts from certainty to uncertainty (or vice versa) more than equivalent shifts from one uncertain state to another.

69
Q

The Pseudo-Certainty Effect

A

Related to the certainty effect. However, in this case an outcome that is actually uncertain is weighed as if it were certain. That is, certainty is not real but only perceived.

70
Q

Disposition Effect

A

The tendency to hold on to stocks that have lost value (relative to their purchase price) too long, and to sell those that have risen in value. Due to the fact that people dislike incurring losses much more than they like incurring gains, and are willing to gamble in the domain of losses

Shefrin & Statman (1985):
People dislike incurring loss and gamble in losses
So they hold on to losing stocks too long and are too eager to sell those that have risen in value.
The Disposition Effect explains market inefficiencies

71
Q

four fold effect

A

The tendency to be risk averse in gains & risk seeking in losses for medium to high probabilities, but risk seeking in gains & risk averse in losses for very small probabilities