Thinking, Fast and Slow Flashcards
Availability heuristic
This is a mental shortcut that relies on immediate examples that come to mind when evaluating a specific topic, concept, method, or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important. However, this can lead to biased judgments because our memories are often influenced by various factors.
Elementary example: If you see news reports about car thefts in your city, you might estimate that the likelihood of your car being stolen is greater than it really is, because the thefts are more “available” in your memory.
Real-world example: People tend to overestimate the likelihood of sensational events such as plane crashes or terrorist attacks because they are often highlighted in the media, making them readily “available” in our minds.
Representativeness heuristic
This heuristic is used when we estimate the likelihood of an event by comparing it to an existing prototype in our minds. This can often lead us to disregard important base rate information.
Elementary example: If you meet a man who wears glasses and loves reading, you might assume that he is a librarian rather than a farmer, because he is more “representative” of the stereotype of a librarian.
Real-world example: Let’s consider the scenario of meeting someone at a technology conference who is shy and introverted. You might immediately think they’re more likely to be a software engineer rather than a salesperson, because the person’s demeanor “represents” the common stereotype of engineers. However, in reality, there’s no reason a salesperson couldn’t be shy or introverted, and making assumptions based on limited information could lead to inaccurate conclusions.
In this case, the representativeness heuristic could cause us to overlook other relevant information (such as the person’s actual job title, skills, or past experience) and make an erroneous judgment based on a stereotype. This mental shortcut can occur because it’s easier and quicker for our brains to categorize people and things based on prominent or ‘representative’ features.
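The base-rate point can be made concrete with a quick Bayes' rule sketch. The population counts and stereotype probabilities below are made-up illustrative numbers, not figures from the book; the point is only that a large base-rate gap can swamp a strong stereotype match.

```python
# Illustrative (assumed) numbers: far more farmers than librarians.
librarians, farmers = 20, 2000          # base rates in some population
p_bookish_given_lib = 0.9               # assumed: most librarians fit the stereotype
p_bookish_given_farm = 0.1              # assumed: few farmers do

# Expected counts of "bookish" people in each group
bookish_libs = librarians * p_bookish_given_lib    # 18
bookish_farms = farmers * p_bookish_given_farm     # 200

# P(librarian | bookish) via Bayes' rule
p_lib_given_bookish = bookish_libs / (bookish_libs + bookish_farms)
print(round(p_lib_given_bookish, 3))  # 0.083
```

Even though a bookish man "represents" a librarian far better than a farmer, under these assumed base rates he is still more than ten times as likely to be a farmer.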
Anchoring effect
This bias occurs when individuals rely too heavily on the first piece of information offered (the "anchor") when making decisions. Once an anchor is set, subsequent judgments are made by adjusting away from it, and other information tends to be interpreted relative to the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiation, so prices lower than the anchor seem reasonable even if they are still higher than what the car is really worth.
Elementary example: If a store marks an item as “reduced from $100 to $50”, you might see it as a bargain. The $100 price serves as an anchor, making the $50 price seem cheap in comparison.
Real-world example: In salary negotiations, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind. Any counteroffer will naturally be anchored by that opening bid.
Framing effect
The framing effect influences how people perceive the risk associated with a particular decision. People are generally risk-averse when a choice is framed positively but become risk-seeking when the same choice is framed negatively.
Impact of Wording: The way information is worded or presented can significantly impact decision-making. Even when the underlying information is the same, the framing can lead to different choices.
The key to understanding the framing effect in this context is recognizing that people tend to avoid risks when the outcome is framed positively because the positive framing highlights the benefits or gains that can be secured without risk.
Conversely, when an outcome is framed negatively, highlighting what can be lost, people may become more inclined to take risks to avoid that loss.
Medical Decision Making:
Positive Frame: “This surgery has a 90% success rate.”
Negative Frame: “There is a 10% chance of complications in this surgery.”
Despite the odds being the same, patients are more likely to opt for surgery when presented with the positive frame (success rate) than the negative one.
Financial Decisions:
Positive Frame: “This investment has a 70% chance of making a profit.”
Negative Frame: “This investment has a 30% chance of losing money.”
Investors might be more inclined to invest when the chance of profit is highlighted rather than the chance of loss.
Consumer Behavior:
Positive Frame: “Save $10 on your purchase today!”
Negative Frame: “Don’t lose your $10 discount on your purchase today!”
Consumers might be more motivated by the negative framing, which emphasizes a potential loss.
Environmental Policy:
Positive Frame: “Switching to renewable energy sources can help preserve 80% of the endangered species.”
Negative Frame: “Failure to switch to renewable energy sources could result in the loss of 80% of endangered species.”
The negative frame might create a stronger incentive for action due to the aversion to loss.
In short, people react differently to the same choice depending on whether it is presented as a gain or a loss: they tend to avoid risk under a positive frame and seek risk under a negative one.
Elementary example: If you tell people that a type of food has 90% fat-free content, they’re more likely to view it as healthy compared to saying it has a 10% fat content, even though both statements are technically true.
Real-world example: People are more likely to support an economic policy if the potential benefits are emphasized over potential costs, even if the costs and benefits are equivalent.
Confirmation bias
This is a tendency to search for, interpret, favor, and recall information in a way that confirms or strengthens one’s prior personal beliefs or hypotheses.
Elementary example: If you believe that left-handed people are more creative, you’re more likely to notice information that supports your belief and ignore information that contradicts it.
Real-world example: During an election, people often favor news sources that align with their political beliefs. They interpret these sources as more reliable and dismiss sources that contradict their views.
Prospect theory and an example
Kahneman and Tversky proposed Prospect Theory as an alternative to Expected Utility Theory, which had been the prevailing model of how people make risky decisions. Their experiments showed that people don't evaluate decisions based on final outcomes, but on the potential value of losses and gains relative to a reference point. People are more sensitive to losses than to equivalent gains (loss aversion), and their valuation of both diminishes as the amounts grow (diminishing sensitivity).
Elementary example: Given the choice between a sure gain of $30 and a coin flip offering a 50% chance of gaining $60 (and a 50% chance of gaining nothing), most people choose the sure thing, even though the expected values are the same.
Real-world implication: This theory is used extensively in economics and finance, and it can help explain various economic behaviors. For instance, investors may hold onto losing stocks too long due to the pain of realizing a loss, even though selling might be the more rational decision.
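The preference for the sure $30 falls out of the theory's concave value function. Below is a minimal sketch using the parameter estimates often cited from Tversky and Kahneman's 1992 follow-up work (diminishing sensitivity of about 0.88, loss aversion of about 2.25); these exact numbers are an assumption for illustration, not figures quoted in the book.

```python
# Sketch of a prospect-theory value function with commonly cited
# parameter estimates: alpha = 0.88 (diminishing sensitivity),
# lam = 2.25 (loss aversion).

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss of x dollars."""
    if x >= 0:
        return x ** alpha              # gains: concave curve
    return -lam * ((-x) ** alpha)      # losses: steeper and convex

# Sure $30 vs. a 50% chance of $60: equal expected value...
sure = value(30)
gamble = 0.5 * value(60) + 0.5 * value(0)

# ...but the concave value function makes the sure thing feel better.
print(sure > gamble)   # True

# Loss aversion: a $100 loss hurts more than a $100 gain pleases.
print(abs(value(-100)) > value(100))  # True
```

The same function also explains the stock example: selling a loser forces the investor to book a loss, which the steep loss branch makes feel worse than an equivalent gain feels good.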
WYSIATI (What You See Is All There Is) Experiment
This concept was demonstrated through various experiments. One such experiment asked participants to estimate the probability that a student, Tom W., who was described as having high intelligence and little creativity, was studying a particular field. Participants based their judgments entirely on the description of Tom, ignoring the statistical distribution of students across fields. For instance, they assumed he was a computer science or engineering student rather than a business student, even though the statistical likelihood of the latter is much higher. This illustrates that people often make judgments based on the information available to them, without considering what they might be missing.
Endowment Effect Experiment
This effect describes how people place a higher value on objects simply because they own them. In one experiment, participants were given a mug and then offered the chance to sell it or trade it for an equally priced pen. Most chose to keep their mugs, even though participants had shown no preference between the items beforehand.
Real-world implication: This effect can be seen in many aspects of economic behavior. For example, people often ask for more money to sell something they own than they would be willing to pay to buy it.
Priming Experiments
Priming refers to the effect by which exposure to a stimulus influences the response to a subsequent stimulus, without conscious guidance or intention. It involves implicit memory, meaning previous experiences can subconsciously trigger our behaviors and responses. It operates as part of our System 1 thinking, the fast, intuitive, and automatic mode of thought.
Elementary Example: In a well-known priming experiment, people who were exposed to words related to old age walked more slowly afterwards. This demonstrates how a stimulus (in this case, words associated with old age) can subconsciously influence behavior, even though participants were given no instruction to slow down.
Real-World Implication: Priming effects can have wide-ranging implications, especially in fields like marketing and advertising. For instance, a commercial that shows happy, smiling people around a product primes viewers to associate the product with happiness. This could influence viewers to purchase the product in order to seek similar feelings of happiness. In our personal lives, the way someone phrases a question could prime us to respond in a certain way, affecting our decisions and judgments. Understanding the effects of priming can help us become more conscious of the factors influencing our decisions, and possibly help us mitigate unwanted influences.
Familiarity Principle/Mere-Exposure Effect
The Mere-Exposure Effect describes the phenomenon where people develop a preference for things merely because they are familiar with them. In social psychology, this effect is also known as the familiarity principle. It operates under System 1 thinking - our judgments are made quickly and automatically based on what is familiar to us.
Elementary Example: An experiment conducted by psychologist Robert Zajonc showed people a series of meaningless symbols. Some symbols were shown more frequently than others. When asked later which symbols they preferred, people consistently preferred the symbols they had seen more often, demonstrating the mere-exposure effect.
Real-World Implication: This psychological phenomenon plays a substantial role in many areas, including marketing and politics. For example, companies use repetitive advertising to familiarize consumers with their products or services, knowing that familiarity will make consumers more likely to choose their offering. Similarly, politicians often repeat key phrases or messages to increase public familiarity and approval. On a personal level, understanding this principle can help us be aware of how our preferences might be influenced by mere familiarity rather than objective assessment. It can encourage us to be more thoughtful and deliberate in our choices, invoking more of our System 2 thinking.
What have you learned about decision-making from the book so far?
Anchoring heuristic: the tendency to rely heavily on the first piece of information we encounter (the "anchor") when making decisions.
Elementary Example: Imagine you’re at a fair, and there’s a game where you need to guess the number of jelly beans in a jar. The game host mentions that the last winner guessed 1200. That number might unknowingly influence your guess, even if the jar looks like it could hold more or less. This is an example of anchoring, where your guess is “anchored” to the initial number you heard.
Real-World Example: In negotiations, the person who makes the first offer often has an advantage, because that number sets the anchor point for the rest of the negotiation. For instance, if you’re selling a car and you start with a higher price, the buyer’s counteroffers are likely to be higher than if you had started with a lower price.
Availability heuristic: the tendency to base decisions on information that comes readily to mind, often ignoring statistical reality.
Elementary Example: If a child sees a news story about a plane crash, they might become afraid of flying, thinking it’s dangerous, despite the statistics that show flying is one of the safest modes of transport.
Real-World Example: In business, leaders might make decisions based on recent events or data that are readily available to them, while ignoring larger trends or historical data. This could lead to decisions that are reactive rather than strategic.
Framing Effect: How information is presented to us can significantly sway our decision.
Elementary Example: Imagine you start with $5 and are told you can either "keep $3" or "lose $2". Both options leave you with the same $3, yet most people prefer the first, because keeping $3 is framed as a gain while losing $2 is framed as a loss.
Real-World Example: In health communications, telling people that a medical procedure has a 90% survival rate (positive frame) will often elicit a more favorable response than saying it has a 10% mortality rate (negative frame), even though the statistical information is the same.
Overconfidence Bias: This is the tendency to overestimate our abilities or the precision of our predictions.
Elementary Example: A student might go into a test thinking they’ll get a 100% score because they studied the night before. However, they end up getting a lower score because they overestimated their ability to retain information and didn’t account for all the potential topics on the test.
Real-World Example: In the business world, a CEO might predict a high return on a new project, ignoring potential risks because of their overconfidence in the project’s success. This bias can lead to significant financial losses if the predictions don’t pan out.
Confirmation Bias: This is our tendency to seek out and interpret information in a way that confirms our preexisting beliefs, while ignoring or devaluing information that contradicts them.
Elementary Example: A child who believes that spiders are dangerous might only pay attention to scary stories about spiders and ignore information that suggests most spiders are harmless.
Real-World Example: In politics, people often seek out news sources that align with their political views. They’re more likely to believe information that supports their perspective and dismiss information that contradicts it. This can lead to a polarized understanding of issues and a reluctance to consider alternative viewpoints.