Non-Standard Beliefs: Projection Bias Flashcards
Deviations from the Standard Model
Standard Model - preferences formula, see slide 3.
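Roughly (up to notation; the exact formula is on the slide), the standard model in DellaVigna (2009) has agent $i$ maximise expected, exponentially discounted utility of her own payoffs, with Bayesian beliefs over states:

$$\max_{x_i^t \in X_i} \ \sum_{t=0}^{\infty} \delta^{t} \sum_{s_t \in S_t} p(s_t)\, U\!\left(x_i^t \mid s_t\right)$$

i.e. self-interested preferences $U$, exponential discounting $\delta$, and correct probabilistic beliefs $p(\cdot)$.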
DellaVigna 2009:
- Non-standard preferences:
  - time-inconsistent preferences
  - social preferences
  - reference-dependent preferences
- Non-standard beliefs:
  - Over-inference (ability, information)
  - Law of Small Numbers
  - Projection Bias
- Non-standard decision-making:
  - Heuristics
  - Impulse purchases
  - Emotions
Loewenstein et al. (2003)
Projection bias = people mis-predict their own future preferences by projecting their current preferences onto the future, underestimating how much those preferences will change.
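A compact formalisation in the spirit of Loewenstein et al. (2003)'s 'simple projection bias': a person currently in state $s$ predicts the utility of consuming $c$ in future state $s'$ as

$$\tilde{u}(c, s' \mid s) = (1-\alpha)\,u(c, s') + \alpha\,u(c, s), \qquad \alpha \in [0,1],$$

where $\alpha = 0$ is an unbiased prediction and $\alpha = 1$ is full projection of the current state onto the future.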
2 examples:
1. under-appreciation of hunger:
- shopping on an empty stomach leads people to buy too much food (Gilbert et al., 2003)
- Read and van Leeuwen (1998): office workers chose between a healthy and an unhealthy snack to be received later, while in one of two current states (satiated or hungry); those choosing while hungry were more likely to pick the unhealthy snack
- they realise their preferences will change, but neglect the magnitude of the change
- under-appreciation of adaptation:
- people predict that patients’ suffering will be far worse than it actually is: predicted quality of life is lower than patients’ actual self-reported quality of life
- people exaggerate the effects of loss aversion, overestimating the extent to which a loss will be painful
Habit formation:
when consumption is habit-forming, rationality implies an increasing consumption profile (consume less now so the habit stock builds slowly)
but projection bias leads to under-appreciation of how current consumption raises future habit levels (adaptation), so people consume too much today (see the sketch below)
implications:
- saving less than actually planned
- inadequate philanthropy
see graph, p. 23
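A minimal numerical sketch of the saving implication (own illustration; the square-root utility, the two-period horizon and the rule h2 = c1 are assumptions, not the paper's specification):

```python
# Two-period consumption-saving with habit formation:
# per-period utility u(c, h) = sqrt(max(c - h, 0)), habit h2 = c1,
# initial habit h1 = 0, wealth W split across the two periods.
# alpha = degree of projection bias when predicting period-2 utility.
import numpy as np

W, h1 = 10.0, 0.0
grid = np.linspace(0.0, W, 1001)   # candidate period-1 consumption levels

def u(c, h):
    return np.sqrt(np.maximum(c - h, 0.0))

def chosen_c1(alpha):
    c2 = W - grid                                  # period-2 consumption
    h2 = grid                                      # true future habit = c1
    predicted_u2 = (1 - alpha) * u(c2, h2) + alpha * u(c2, h1)
    return grid[np.argmax(u(grid, h1) + predicted_u2)]

print("rational (alpha=0) c1:", chosen_c1(0.0))    # ~W/6: rising consumption profile
print("biased   (alpha=1) c1:", chosen_c1(1.0))    # W/2: consumes too much early, under-saves
```

With these assumed functional forms the rational agent keeps early consumption low so the habit builds slowly, while the fully biased agent splits wealth evenly and ends up saving less than planned.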
Gilbert and Wilson (2005, 2007)
How we simulate future events in our minds
- Overestimation of intensity and duration of reactions to future events = IMPACT BIAS
IMPACT BIAS is due to two things:
- Focalism: we underestimate the extent to which other events will occupy our thoughts, neglecting the broader domain of things we value
- Failure to anticipate how quickly we will make sense of (and adapt to) what happens
Pre-feelings - the simulated, anticipatory sense of the event, felt before the event occurs
Actual feeling - what is experienced at the point of the event itself
but: Empirical Issues:
Observed diversification bias: when choosing in advance for several future occasions, people build in more variety than they end up wanting. E.g. choosing among Snickers (S), Mars (M) and Galaxy (G): someone who currently prefers S would, if simply projecting current preferences, choose (S, S, S) for the future, yet the advance choice is the varied bundle (S, M, G)
- Individuals may act according to current preferences because they deem them more valid
Conlin et al. (2007)
Field evidence on projection bias, exploiting a clear exogenous change: the weather
- Are people over-influenced by the weather when making decisions? If fully rational, the weather at the time of ordering should be irrelevant.
- Projection bias prediction: cold-weather items ordered when it is relatively cold are more likely to be returned; buyers project the current cold onto the future, then later realise they did not need so many cold-weather items and return them
=> alternative channels: weather affects mood, or reminds more marginal individuals to buy; but the result is robust to household FE
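An illustrative reduced-form specification in the spirit of this test (a sketch, not the paper's exact model; variable names are placeholders):

$$\text{Return}_{ih} = \beta\,\text{TemperatureAtOrder}_{ih} + X_{ih}'\gamma + \mu_h + \varepsilon_{ih}$$

where $h$ indexes households, $i$ cold-weather-item orders, $X_{ih}$ holds controls, and $\mu_h$ is a household fixed effect; projection bias predicts $\beta < 0$ (the colder it is when the item is ordered, the higher the probability of a later return).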
Dalton and Ghosal (2012)
Any endogeneity of context suggests a potential mechanism for systematic mistakes
- in the model, ‘frames’ (psychological states) are preference parameters that are endogenously affected by the actions you take
- if individuals take this feedback from their actions to their preferences into account, then they are rational
- if they fail to take into account this feedback from actions to preferences (which in turn affect utility), then they are irrational => they make systematic mistakes
=> projection bias is understood as a behavioural error: neglecting the feedback from actions to preference parameters
contrast with Conlin et al. (2007), who rely on an exogenous source of variation (the weather) rather than endogenous feedback
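A stylised way to write the distinction (a sketch of the idea, not the paper's exact notation): let $a$ be an action and $p$ a psychological state ('frame') with feedback $p = \Gamma(a)$.

$$\text{rational: } a^{R} \in \arg\max_{a}\, u(a, \Gamma(a)) \qquad\quad \text{behavioural: } a^{B} \in \arg\max_{a}\, u(a, p^{*}), \ \ p^{*} = \Gamma(a^{B}) \text{ taken as given}$$

Projection bias corresponds to the behavioural case: the agent optimises as if the preference parameter were fixed, and so makes systematic mistakes whenever $\Gamma$ actually varies with the action.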
Heuristics
Heuristics are methods of arriving at satisfactory solutions without much computation => mental shortcuts
- Tversky and Kahneman (1974)
Three heuristics that bias judgments of the probability of events:
- anchoring: excessive reliance on first piece of info
- availability: exaggerated importance of information that can be easily recalled
- representativeness: judging probability by the degree to which a small sample resembles (is representative of) the large population
Heuristics underlie two related phenomena: gambler’s fallacy and over-inference.
Rabin 2002
Inference by Believers in the Law of Small Numbers
Law of Large Numbers: a large random sample from a population will closely resemble the overall population
Law of Small Numbers: exaggerating the extent to which a small sample will reflect the entire population
Rabin develops a model of this error using draws from an urn
=> Assumption: subjects observe a sequence of signals drawn from an i.i.d. process, but incorrectly believe the signals are drawn as if from an urn without replacement
Example:
Observe: HHH
Rational Bayesian: p(H) = 0.5
Under the gambler’s fallacy, p(H) < 0.5, because the draws are treated as being without replacement
e.g. if the urn has N = 12 balls (6 H, 6 T): after HHH, believed p(H) = (6 − 3)/(12 − 3) = 1/3 < 0.5
if N = 20 (10 H, 10 T): after HHH, believed p(H) = (10 − 3)/(20 − 3) = 7/17 ≈ 0.41
As N → ∞, the believed probability approaches the true Bayesian 0.5
=> individuals exaggerate the extent to which a small sample will resemble the entire population (so after extreme streaks they expect reversals)
- In the gambler’s fallacy, they wrongly infer (negative) serial dependence when the observations are in fact strictly independent
Gambler’s Fallacy: Financial Example
Suppose the return to a mutual fund is drawn from an urn with 10 balls, 5 Up and 5 Down, with replacement = independent events.
After (Up, Up):
- A rational Bayesian would infer that the probability of up=down=0.5
- A believer in the LSN / gambler’s fallacy: P(Up) = (5 − 2)/(10 − 2) = 3/8 < 0.5
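A minimal sketch of the urn arithmetic used above (own illustration, not Rabin's full model): the believer treats an i.i.d. 50/50 process as draws without replacement from an urn of size N, half of whose balls favour each outcome.

```python
from fractions import Fraction

def lsn_prob_next(N, observed):
    """Believed probability that the next draw repeats the streak, after
    `observed` identical outcomes, if the 50/50 process is (wrongly)
    treated as an urn of N balls drawn without replacement."""
    favourable = Fraction(N, 2)
    return (favourable - observed) / (N - observed)

print(lsn_prob_next(12, 3))            # 1/3   (HHH example, N = 12)
print(lsn_prob_next(20, 3))            # 7/17 ~= 0.41 (HHH example, N = 20)
print(lsn_prob_next(10, 2))            # 3/8   (mutual fund example after Up, Up)
print(float(lsn_prob_next(10**6, 3)))  # ~0.5: approaches the Bayesian answer as N grows
```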
Overinference: Hot Hand Effect
Inspired by basketball, where a player on a scoring streak is believed to have a ‘hot hand’
When the underlying process is uncertain, belief in the LSN leads to over-inference from streaks: the opposite (positive) direction to the gambler’s fallacy
e.g. even if a coin is in fact fair, a believer in the LSN who observes two heads in a row infers (too strongly) that the coin is biased towards heads, i.e. p(head) > 0.5
Overinference: Financial Example
Mutual Fund, with manager of uncertain ability
2 urns, each 10 balls:
(1) well-managed fund, 7 UP and 3 Down
(2) poorly managed fund, 3 UP and 7 Down
Observe (Up, Up, Up)
A rational Bayesian already infers that the fund is probably well managed (probability greater than 0.5); the believer in the LSN over-infers and finds it extremely likely to be well managed
Equivalently, the believer in the LSN under-predicts the likelihood that the fund was poorly managed: 2.8% vs the rational 7.3%
=> Rabin (2002): chance (‘fictitious’) variation is incorrectly attributed to the fund being good
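A minimal sketch reproducing the 2.8% vs 7.3% figures (own illustration): the posterior probability that the fund is poorly managed after (Up, Up, Up), for a rational Bayesian treating draws as i.i.d. versus an LSN believer who treats each 10-ball urn as drawn without replacement.

```python
def likelihood_iid(p_up, n_up):
    """P(n_up Ups in a row) if each draw is independent with prob p_up."""
    return p_up ** n_up

def likelihood_no_replacement(ups_in_urn, urn_size, n_up):
    """P(n_up Ups in a row) if drawn without replacement from the urn."""
    prob = 1.0
    for k in range(n_up):
        prob *= (ups_in_urn - k) / (urn_size - k)
    return prob

prior_poor = 0.5   # equal prior: well-managed (7 Up / 3 Down) vs poor (3 Up / 7 Down)

# Rational Bayesian posterior that the fund is poorly managed
l_well = likelihood_iid(0.7, 3)                       # 0.343
l_poor = likelihood_iid(0.3, 3)                       # 0.027
rational = prior_poor * l_poor / (prior_poor * l_poor + (1 - prior_poor) * l_well)

# LSN believer's posterior (without-replacement likelihoods)
l_well_lsn = likelihood_no_replacement(7, 10, 3)      # 7/10 * 6/9 * 5/8
l_poor_lsn = likelihood_no_replacement(3, 10, 3)      # 3/10 * 2/9 * 1/8
lsn = prior_poor * l_poor_lsn / (prior_poor * l_poor_lsn + (1 - prior_poor) * l_well_lsn)

print(f"rational P(poorly managed | UUU) = {rational:.3f}")  # ~0.073
print(f"LSN      P(poorly managed | UUU) = {lsn:.3f}")       # ~0.028
```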
Economic Implications of Overinference
Benartzi (2001): the degree to which employees invest in their employer’s stock depends on the firm’s past performance
De Bondt and Thaler (1985): investors over-infer from past performance: if stocks performed well => investors over-invest, the stocks become overpriced and subsequently underperform
Economic Implications of Gambler’s Fallacy
Barberis et al. (1998): over short strings of performance, investors under-predict repetition: they believe a series of stock price rises will be followed by a fall => gambler’s fallacy
over longer strings => over-inference: they believe a long sequence implies a trend, producing overreaction to announcements
Burks et al. (2013)
What are the mechanisms generating overconfidence?
Test 3 theories:
- Uncertainty and Bayesian updating
- Self-image concerns:
  i. Consumption value: a positive self-image as a good in itself
  ii. Motivation value: optimistic self-assessments motivate; optimists are less likely to ask for information
- Signalling value => the best-supported explanation: genuinely confident beliefs make a positive external self-presentation easier
Procedure:
- Self-reported assessment of how well they would do
- IQ test or Numeracy test
- Post-test self-assessment
- Asked if would like to know results.
Beliefs:
Above-average bias: most people place themselves in the top 40% of the distribution, especially on the IQ test => strong overconfidence
1st explanation (uncertainty and Bayesian updating): overconfident people should then be seeking less information; uncertainty as an explanation is quickly dismissed
2nd, self-image: in a threshold-type model, once individuals are certain they are of high ability they stop seeking information, which generates overconfident beliefs; but Burks et al. find that optimistic individuals are more likely to demand information
3rd, social signalling: people strategically manipulate how confident they say they are, because they have a preference for others to think they are confident
Burks et al. (2013) measures and results
Three measures for making predictions:
- Social potency scale = how much an individual likes to dominate and influence others and derives pleasure from being in the limelight
- Social closeness scale = general desire to be connected to others
- Stress reaction scale = related to the costs of overconfidence: how sensitive people are to the judgements of others, e.g. being criticised for being overconfident
Results:
Individuals are overconfident because they care about the opinions of others
- see also applications to advertising: consumers think they will pay the low rate but end up paying the high rate
Hard-easy effect
People tend to be overconfident in their ability to predict events when they have poor information.
people who are asked easy questions tend to be underconfident: this contrast is the hard-easy effect
overestimation of the precision of one’s information leads investors to trade too much
overconfidence in the precision of information, coupled with self-attribution bias, may explain short-term positive correlation in returns (momentum) and long-term negative correlation (reversals)
=> the hard-easy effect predicts overconfidence when questions (information) are hard and underconfidence when they are easy