Behavioral Science Concepts Flashcards
Affect heuristic
The affect heuristic represents a reliance on good or bad feelings experienced in relation to a stimulus. Affect-based evaluations are quick, automatic, and rooted in experiential thought that is activated prior to reflective judgments (see dual-system theory) (Slovic, Finucane, Peters, & MacGregor, 2002). For example, experiential judgments are evident when people are influenced more by risks framed in terms of counts (e.g. “of every 100 patients similar to Mr. Jones, 10 are estimated to commit an act of violence”) than by an abstract but equivalent probability frame (e.g. “Patients similar to Mr. Jones are estimated to have a 10% chance of committing an act of violence to others”) (Slovic, Monahan, & MacGregor, 2000). Affect-based judgments are more pronounced when people lack the resources or time to reflect. Under time pressure, instead of weighing risks and benefits independently, individuals with a negative attitude towards nuclear power may judge its benefits as low and its risks as high, producing a stronger inverse risk-benefit correlation than is evident without time pressure (Finucane, Alhakami, Slovic, & Johnson, 2000). The affect heuristic has been proposed as an explanation for a range of consumer judgments, including the zero price effect, and it is considered a general-purpose heuristic similar to availability and representativeness, in the sense that affect serves as an orienting mechanism akin to similarity and memorability (Kahneman & Frederick, 2002).
Altruism
According to neoclassical economics, rational beings do whatever they need to in order to maximize their own wealth. However, when people make sacrifices to benefit others without expecting a personal reward, they are thought to behave altruistically (Rushton, 1984). Common applications of this pro-social behavior include volunteering, philanthropy, and helping others in emergencies (Piliavin & Charng, 1990).
Altruism is evident in a number of research findings, such as dictator games. In this game, one participant proposes how to split a reward between himself and another random participant. While some proposers (dictators) keep the entire reward for themselves, many will also voluntarily share some portion of the reward (Fehr & Schmidt, 1999).
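The Fehr & Schmidt (1999) model cited above can be sketched in a few lines: an inequity-averse dictator trades off his own payoff against guilt from advantageous inequality, which is enough to generate voluntary sharing. The parameter values below are illustrative assumptions, not estimates from the paper.

```python
def dictator_choice(pie=10, alpha=0.8, beta=0.4):
    """Amount an inequity-averse dictator keeps out of `pie` units.

    Fehr-Schmidt (1999) utility:
      U = x_self - alpha*max(x_other - x_self, 0) - beta*max(x_self - x_other, 0)
    alpha and beta here are illustrative assumptions.
    """
    best_keep, best_u = 0, float("-inf")
    for keep in range(pie + 1):
        other = pie - keep
        u = keep - alpha * max(other - keep, 0) - beta * max(keep - other, 0)
        if u > best_u:
            best_keep, best_u = keep, u
    return best_keep
```

With weak guilt (beta below 0.5) the dictator keeps everything, while stronger guilt (beta above 0.5) makes an equal split optimal, matching the mix of selfish and sharing behavior observed in dictator games.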
While altruism focuses on sacrifices made to benefit others, similar concepts explore making sacrifices to ensure fairness (see inequity aversion and social preferences).
Ambiguity (uncertainty) aversion
Ambiguity aversion, or uncertainty aversion, is the tendency to favor the known over the unknown, including known risks over unknown risks. For example, when choosing between two bets, we are more likely to choose the bet for which we know the odds, even if the odds are poor, than the one for which we don’t know the odds.
This aversion gained attention through the Ellsberg paradox (Ellsberg, 1961). Suppose there are two bags, each containing a mixture of 100 red and black balls. A decision-maker is asked to draw a ball from one of the two bags, winning $100 if red is drawn. In one bag, the decision-maker knows that exactly half of the balls are red and half are black; the color mixture in the second bag is unknown. Due to ambiguity aversion, decision-makers favor drawing from the bag with the known mixture over the one with the unknown mixture (Ellsberg, 1961). This occurs even though people would, on average, bet on red and black equally if presented with just one bag, whether it held the known 50-50 mixture or the unknown mixture.
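A quick back-of-the-envelope check shows why this preference is a puzzle for expected-value reasoning. Assuming a uniform "ignorance prior" over the ambiguous bag's composition (an assumption made here for illustration, not part of Ellsberg's argument), both bags offer the same expected chance of drawing red:

```python
# Known bag: exactly 50 of 100 balls are red.
KNOWN_P_RED = 50 / 100

# Ambiguous bag: any composition from 0 to 100 red balls,
# treated as equally likely under a uniform ignorance prior.
ambiguous_p_red = sum(r / 100 for r in range(101)) / 101

print(KNOWN_P_RED, ambiguous_p_red)  # both approximately 0.5
```

Since the expected win probabilities are identical, the strict preference for the known bag must come from an aversion to the ambiguity itself rather than from the odds.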
Ambiguity aversion has also been documented in real-life situations. For example, it leads people to avoid participating in the stock market, which carries unknown risks (Easley & O’Hara, 2009), and to avoid certain medical treatments when the risks are less known (Berger et al., 2013).
Anchoring (heuristic)
Anchoring is a particular form of priming effect whereby initial exposure to a number serves as a reference point and influences subsequent judgments about value. The process usually occurs without our awareness (Tversky & Kahneman, 1974). One experiment asked participants to write down the last three digits of their phone number multiplied by one thousand (e.g. 678 becomes 678,000). Participants’ subsequent estimates of house prices were significantly influenced by this arbitrary anchor, even though they had been given a 10-minute presentation of facts and figures from the housing market at the beginning of the study. In practice, anchors are often less arbitrary: the price of the first house shown to us by a real estate agent may serve as an anchor and influence our perception of subsequently presented houses as relatively cheap or expensive. Anchoring effects have also been shown for consumer packaged goods, where not only explicit slogans to buy more (e.g. “Buy 18 Snickers bars for your freezer”) but also purchase quantity limits (e.g. “limit of 12 per person”) or ‘expansion anchors’ (e.g. “101 uses!”) can increase purchase quantities (Wansink, Kent, & Hoch, 1998).
Availability heuristic
Availability is a heuristic whereby people make judgments about the likelihood of an event based on how easily an example, instance, or case comes to mind. For example, investors may judge the quality of an investment based on information that was recently in the news, ignoring other relevant facts (Tversky & Kahneman, 1974). Similarly, it has been shown that individuals with a greater ability to recall antidepressant advertising estimate the prevalence of depression to be higher than those with low recall (An, 2008), while less knowledgeable consumers use the ease with which they can recall low-price products as a cue to make judgments about overall store prices (Ofir, Raghubir, Brosh, Monroe, & Heiman, 2008). The availability of information in memory also underlies the representativeness heuristic.
Bounded rationality
Bounded rationality is a concept proposed by Herbert Simon that challenges the notion of human rationality as implied by the concept of homo economicus. Rationality is bounded because there are limits to our thinking capacity, available information, and time (Simon, 1982). Bounded rationality is similar to the social-psychological concept that describes people as “cognitive misers” (Fiske & Taylor, 1991) and represents a fundamental idea about human psychology that underlies behavioral economics. (See also satisficing.)
Certainty/possibility effects
Changes in the probability of gains or losses do not affect people’s subjective evaluations in linear terms (see also prospect theory and zero price effect) (Tversky & Kahneman, 1981). For example, a move from a 50% to a 60% chance of winning a prize has a smaller emotional impact than a move from a 95% to a 100% chance (the certainty effect). Conversely, a move from a 0% to a 5% chance of winning (the possibility effect) is more attractive than a move from, say, 5% to 10%. People overweight small probabilities, which helps explain lottery gambling: a small expense buys the possibility of a big win.
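This non-linearity can be made concrete with the probability weighting function from cumulative prospect theory, w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ) (Tversky & Kahneman, 1992). The value γ = 0.61 is their median estimate for gains and is used below purely for illustration:

```python
def w(p, gamma=0.61):
    """Tversky & Kahneman (1992) probability weighting function.

    gamma=0.61 is their median estimate for gains, used here
    for illustration only.
    """
    if p in (0.0, 1.0):
        return p  # impossibility and certainty are weighted exactly
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# Possibility effect: the jump from impossible to merely possible
# gets more decision weight than an equal jump higher up.
print(w(0.05) - w(0.0), w(0.10) - w(0.05))

# Certainty effect: the jump from probable to certain gets more
# decision weight than an equal jump in the middle of the scale.
print(w(1.0) - w(0.95), w(0.60) - w(0.50))
```

Under these weights, w(0.05) exceeds 0.05, capturing the overweighting of small probabilities that makes lottery tickets attractive.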
Choice architecture
This term was coined by Thaler and Sunstein (2008) and refers to the practice of influencing choice by changing the manner in which options are presented to people. For example, this can be done by setting defaults, framing, or adding decoy options.
Choice overload
Also referred to as ‘overchoice’, the phenomenon of choice overload occurs as a result of too many choices being available to consumers. Choice overload may refer to either choice attributes or alternatives. The application of heuristics in decision making becomes more likely with a greater number or complexity of choices. Overchoice has been associated with unhappiness (Schwartz, 2004), decision fatigue, going with the default option, as well as choice deferral—avoiding making a decision altogether, such as not buying a product (Iyengar & Lepper, 2000). Choice overload can be counteracted by simplifying choice attributes or the number of available options (Johnson et al., 2012).
Cognitive bias
A cognitive bias (e.g. Ariely, 2008) is a systematic (non-random) error in thinking, in the sense that a judgment deviates from what would be considered desirable from the perspective of accepted norms or correct in terms of formal logic. The application of heuristics is often associated with cognitive biases, some of which, such as those arising from availability or representativeness, are ‘cold’ in the sense that they do not reflect a person’s motivation and are instead the result of errors in information processing. Other cognitive biases, especially those that have a self-serving function (e.g. optimism bias), are more motivated. Finally, some biases, such as confirmation bias, can be motivated or unmotivated (Nickerson, 1998).
Cognitive dissonance
Cognitive dissonance, an important concept in social psychology (Festinger, 1957), refers to the uncomfortable tension that can exist between two simultaneous and conflicting ideas or feelings—often as a person realizes that s/he has engaged in a behavior inconsistent with the type of person s/he would like to be, or be seen publicly to be. According to the theory, people are motivated to reduce this tension by changing their attitudes, beliefs, or actions. For example, smokers may rationalize their behavior by holding ‘self-exempting beliefs’, such as “The medical evidence that smoking causes cancer is not convincing” or “Many people who smoke all their lives live to a ripe old age, so smoking is not all that bad for you” (Chapman et al., 1993). Arousing dissonance can be used to achieve behavioral change; one study (Dickerson et al., 1992), for instance, made people mindful of their wasteful water consumption and then made them urge others (publicly commit) to take shorter showers. Subjects in this ‘hypocrisy condition’ subsequently took significantly shorter showers than those who were only reminded that they had wasted water or merely made the public commitment.
Commitment
Commitments (see also precommitment) are often used as a tool to counteract people’s lack of willpower and to achieve behavior change, such as in the areas of dieting or saving—the greater the cost of breaking a commitment, the more effective it is (Dolan et al., 2010). From the perspective of social psychology, individuals are motivated to maintain a consistent and positive self-image (Cialdini, 2008), and they are likely to keep commitments to avoid reputational damage and/or cognitive dissonance (Festinger, 1957). The behavior change technique of ‘goal setting’ is related to making commitments (Strecher et al., 1995), while reciprocity involves an implicit commitment.
Confirmation bias
Confirmation bias occurs when people seek out or evaluate information in a way that fits with their existing thinking and preconceptions. The domain of science, where theories should advance based on both falsifying and supporting evidence, has not been immune to bias, which is often associated with people trying to bolster existing attitudes and beliefs. For example, a consumer who likes a particular brand and researches a new purchase may be motivated to seek out customer reviews on the internet that favor that brand. Confirmation bias has also been related to unmotivated processes, including primacy effects and anchoring, evident in a reliance on information that is encountered early in a process (Nickerson, 1998).
Control premium
In behavioral economics, the control premium refers to people’s willingness to forego potential rewards in order to retain control over (rather than delegate) their own payoffs. In an experiment, participants chose whether to bet on another person or on themselves answering a quiz question correctly. Although betting on themselves would have maximized expected rewards in only 56% of the decisions (given their own beliefs), participants actually bet on themselves 65% of the time, suggesting an aggregate control premium of almost 10 percentage points. The average study participant was willing to sacrifice between 8% and 15% of expected earnings to retain control (Owens et al., 2014). (See also overconfidence.)
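The arithmetic behind the "almost 10%" figure can be sketched directly from the rates reported above:

```python
# Figures as summarized from Owens et al. (2014).
optimal_self_bet_rate = 0.56  # share of decisions where betting on oneself
                              # maximized expected earnings, given beliefs
actual_self_bet_rate = 0.65   # share of decisions where participants
                              # actually bet on themselves

# Excess self-betting: the aggregate control premium in percentage points.
control_premium_points = actual_self_bet_rate - optimal_self_bet_rate
print(control_premium_points)  # roughly 0.09, i.e. almost 10 points
```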
Decision fatigue
There are psychological costs to making decisions. Since choosing can be difficult and requires effort, like any other activity, long sessions of decision making can lead to poor choices. Like other activities that consume the resources required for executive functions, decision fatigue is reflected in impaired self-regulation, such as a diminished ability to exercise self-control (Vohs et al., 2008). (See also choice overload and ego depletion.)
Decision staging
When people make complex or long decisions, such as buying a car, they tend to successively explore their options. This includes what information to focus on, as well as choices between attributes and alternatives. For example, when people narrow down their options, they often tend to screen alternatives on the basis of a subset of attributes and then compare alternatives. Choice architects may not only break down complex decisions into multiple stages to make the process easier, they can also work with an understanding of successive decision making by facilitating certain comparisons at different stages of the choice process (Johnson et al., 2012).
Decoy effect
Choices often occur relative to what is on offer rather than based on absolute preferences. The decoy effect is technically known as an ‘asymmetrically dominated choice’ and occurs when people’s preference for one option over another changes as a result of adding a third (similar but less attractive) option. For example, people are more likely to choose an elegant pen over $6 in cash if there is a third option in the form of a less elegant pen (Bateman, Munro, & Poe, 2008).
Default (option/setting)
Default options are pre-set courses of action that take effect if nothing is specified by the decision maker (Thaler & Sunstein, 2008), and setting defaults is an effective tool in choice architecture when there is inertia or uncertainty in decision making (Samson, 2014). Requiring people to opt-out if they do not wish to donate their organs, for example, has been associated with higher donation rates (Johnson & Goldstein, 2003).
Disposition effect
The disposition effect refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains (Shefrin & Statman, 1985). This phenomenon can be explained by prospect theory (loss aversion), regret avoidance and mental accounting.