Cognitive Bias - Deck #1 Flashcards

1
Q

Attentional bias

A

Attentional bias is the tendency for people’s perception to be affected by their recurring thoughts at the time.[1] Attentional biases may explain an individual’s failure to consider alternative possibilities, as specific thoughts guide the train of thought in a certain manner.[2] For example, cigarette smokers tend to show a bias toward cigarettes and other smoking-related cues around them, because of the positive associations they have already formed between smoking and the cues they were exposed to while smoking.

2
Q

Sunk cost fallacy

A

The tendency for a company or organization to continue with a project because they have already invested a lot of money, time, or effort in it, even when continuing is not the best thing to do.

3
Q

Illusory correlation

A

An illusory correlation occurs when an individual imagines that a correlational relationship exists between data sets (usually with people, events, or behavior) when it really doesn’t.

An example of this could be looking at the relationship between washing your car and rainstorms. We all know intellectually that washing your car has no real effect on the weather, but it often seems that it rains shortly after you wash your car.
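
A minimal Python sketch of this idea (the wash and rain probabilities are made-up illustration values): rain is simulated independently of car washes, and the conditional rain rates come out the same either way.

import random

random.seed(0)
days = 10_000
wash_rain = wash_total = nowash_rain = nowash_total = 0

for _ in range(days):
    washed = random.random() < 0.1   # assume you wash the car on ~10% of days
    rained = random.random() < 0.3   # assume it rains on ~30% of days, regardless of washing
    if washed:
        wash_total += 1
        wash_rain += rained
    else:
        nowash_total += 1
        nowash_rain += rained

print("P(rain | washed car)   ~", round(wash_rain / wash_total, 2))
print("P(rain | did not wash) ~", round(nowash_rain / nowash_total, 2))
# Both rates land near 0.3: the correlation we "remember" is not in the data.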

4
Q

Appeal to novelty

A

The appeal to novelty (also called argumentum ad novitatem) is a fallacy in which one prematurely claims that an idea or proposal is correct or superior exclusively because it is new and modern. In a controversy between the status quo and a new invention, an appeal to novelty is not in itself a valid argument. The fallacy can take two forms: overestimating the new and modern, prematurely and without investigation assuming it to be the best case, or underestimating the status quo, prematurely and without investigation assuming it to be the worst case.

Examples:

  • “If you want to lose weight, your best bet is to follow the latest diet.”
  • “The department will become more profitable because it has been reorganized.”
  • “Upgrading all your software to the most recent versions will make your system more reliable.”
5
Q

Anchoring

A

The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information that we acquire on that subject).

Example:
If your visitor first encountered a competitor’s product priced at $49 per month (that’s their anchor), they’ll be less likely to accept your $69 per month price.

6
Q

Availability Cascade

A

A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).

Example:
This goes hand-in-hand with attentional bias. The more people talk about your site in a positive way, the more likely they are to purchase from you. Conversely, the more people talk about your site in a negative way, the less likely they are to purchase from you.

7
Q

Backfire effect

A

When people react to disconfirming evidence by strengthening their beliefs.

Example:
Your visitors believe what they believe, and all of the factual evidence in the world won’t change their minds. Instead, you’ll need to rely on emotional persuasion. Appealing to rationality won’t change deeply held beliefs (e.g. Apple is better than PC, Slack is better than HipChat, Pages is better than Word, etc.).

8
Q

Bandwagon effect

A

The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.

Example:
If your visitor thinks everyone else is using your product or service, they’re more likely to use your product or service. That’s why creating scarcity and social proof are so effective.

9
Q

Belief bias

A

An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.

Example:
When you make extraordinary claims, regardless of whether or not they are true, your visitors are less likely to purchase from you. If it sounds “too good to be true”, your visitors will believe that, well, it is too good to be true.

10
Q

Clustering illusion

A

The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).

Example:
You’re likely to spot trends where there are none, which will result in future tests and hypotheses based on false information. To avoid this, be sure you’ve calculated the correct sample size prior to beginning your test. Do not stop the test until you’ve reached that full sample size.
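
As a rough guide, here is a minimal Python sketch of a standard sample-size calculation for comparing two conversion rates (two-sided z-test); the baseline rate of 5% and the target lift to 6% are made-up illustration values, not a recommendation.

from math import sqrt, ceil
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    # Standard two-proportion formula: visitors needed in EACH variation.
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 5% to a 6% conversion rate needs roughly 8,000+ visitors per variation.
print(sample_size_per_variant(0.05, 0.06))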

11
Q

Confirmation bias

A

The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.

Example:
In a way, this is similar to clustering illusion. Once you have an idea in your head, you subconsciously begin to seek out information that confirms that idea or belief. As an experimenter, this could mean a lot of wasted time on insignificant tests.

12
Q

Contrast Effect

A

The enhancement or reduction of the perception of a stimulus when it is compared with a recently observed, contrasting object.

Example:
While you must meet “basic expectations”, you must also create contrast. Surprising stimuli cause the brain to slow down, focus on them, and commit them to memory.

13
Q

Curse of knowledge

A

When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.

Example:
You are not your visitors. You have become so familiar with your site that you can no longer use or view it the way a new visitor would. When redesigning to increase conversions, do not make the decisions yourself. Ask someone else, someone less informed, to think about the UX and design.

14
Q

Empathy Gap

A

The tendency to underestimate the influence or strength of feelings, in either oneself or others.

Example:
You’re underestimating the role emotions play in decision-making (with yourself and others). We believe that we’re rational people making rational decisions.

15
Q

Framing Effect

A

Drawing different conclusions from the same information, depending on how that information is presented.

If you’re conducting qualitative research, the questions that you ask are subject to this effect. The way you ask a question can lead to very different results. Before publishing a survey or asking even a single question, ensure the language is clear and you are not leading the respondents to a certain answer.

16
Q

Post-Purchase Rationalization

A

The tendency to persuade oneself through rational argument that a purchase was a good value.

Example:
Are you familiar with the term “buyer’s remorse”? In 2013, a survey found that over 50% of people often or sometimes feel buyer’s remorse. When it kicks in, your visitors persuade themselves, with rational arguments, into believing the purchase was a good idea.

17
Q

Unit Bias

A

The tendency to want to finish a given unit of a task or an item. It has particularly strong effects on the consumption of food.

Consider how many bad movies and books you’ve finished simply because you started. Once you begin, you feel inclined to finish. This is why storytelling is so powerful. Structure your copy to include a beginning, a middle, and an end to improve readability and time on page.

18
Q

False consensus effect

A

The tendency for people to overestimate the degree to which others agree with them.

“No one clicks on popups.” “Everyone hates sliders.” Do these types of phrases sound familiar? We tend to believe that others think the same way we do, agreeing with us more often than not. This can negatively impact the way we design, write copy, etc.

19
Q

Context effect

A

Cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa).

If you’re trying to play on your visitors’ past feelings or memories or behaviors, context is important. Those feelings, memories and behaviors are easier to recall when they’re in the same context they were previously. Use your copy and design to set the stage and make retrieving those feelings, memories and behaviors easier.

20
Q

Humor effect

A

Humorous items are more easily remembered than non-humorous ones, which might be explained by the distinctiveness of humor, the increased cognitive processing time required to understand the humor, or the emotional arousal caused by the humor.

Example:
What do Dollar Shave Club, Poopourri and this £8,999 urine-free wetsuit all have in common? They all used the humor effect to be more memorable and capture more attention. Think about how many times you say something like, “I saw the funniest thing…” Humor spreads like wildfire.

21
Q

Gambler’s fallacy

A

The gambler’s fallacy, also known as the Monte Carlo fallacy or the fallacy of the maturity of chances, is the mistaken belief that if something happens more frequently than normal during a given period, it will happen less frequently in the future (or vice versa).
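
A minimal Python sketch of why this belief is mistaken for independent events: after five heads in a row, the next flip of a fair coin still comes up heads about half the time.

import random

random.seed(1)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_streak = []
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):              # the previous five flips were all heads
        after_streak.append(flips[i])    # record what happened on the next flip

print("P(heads | five heads in a row) ~",
      round(sum(after_streak) / len(after_streak), 3))      # comes out near 0.5: tails is not "due"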

22
Q

Naïve cynicism

A

Naïve cynicism is a philosophy of mind, cognitive bias and form of psychological egoism that occurs when people naïvely expect more egocentric bias in others than actually is the case.

23
Q

Illusion of validity

A

Illusion of validity is a cognitive bias in which a person overestimates his or her ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data “tell” a coherent story.[1][2]

Example:
Daniel Kahneman, Paul Slovic, and Amos Tversky explain the illusion as follows: “people often predict by selecting the output…that is most representative of the input….The confidence they have in their prediction depends primarily on the degree of representativeness…with little or no regard for the factors that limit predictive accuracy. Thus, people express great confidence in the prediction that a person is a librarian when given a description of his personality which matches the stereotype of librarians, even if the description is scanty, unreliable, or outdated. The unwarranted confidence which is produced by a good fit between the predicted outcome and the input information may be called the illusion of validity.”

24
Q

Law of the instrument

A

The concept known as the law of the instrument, otherwise known as the law of the hammer,[1] Maslow’s hammer (or gavel), or the golden hammer,[a] is a cognitive bias that involves an over-reliance on a familiar tool. As Abraham Maslow said in 1966, “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”

25
Q

Cherry picking

A

Cherry picking, suppressing evidence, or the fallacy of incomplete evidence is the act of pointing to individual cases or data that seem to confirm a particular position while ignoring a significant portion of related cases or data that may contradict that position. It is a kind of fallacy of selective attention, the most common example of which is the confirmation bias.[1][2] Cherry picking may be committed intentionally or unintentionally. This fallacy is a major problem in public debate.[3]

Example:
In argumentation, the practice of “quote mining” is a form of cherry picking,[8] in which the debater selectively picks quotes that support a position (or exaggerate an opposing position) while ignoring those that moderate the original quote or put it into a different context. Cherry picking in debates is a serious problem because the facts cited are themselves true but need to be put in context. Because research cannot be done live and is often untimely, cherry-picked facts or quotes usually stick in the public mainstream and, even when corrected, lead to widespread misrepresentation of the groups targeted.

26
Q

Survivorship bias

A

Survivorship bias or survival bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways. It is a form of selection bias.

Survivorship bias can lead to overly optimistic beliefs because failures are ignored, such as when companies that no longer exist are excluded from analyses of financial performance. It can also lead to the false belief that the successes in a group have some special property, rather than being mere coincidence (mistaking correlation for causation). For example, if three of the five students with the best college grades went to the same high school, that can lead one to believe that the high school must offer an excellent education. This could be true, but the question cannot be answered without looking at the grades of all the other students from that high school, not just the ones who “survived” the top-five selection process. Another form of survivorship bias is concluding that an incident was not as dangerous as it actually was because everyone you communicate with afterwards survived it; even if you know that some people died, they are not there to add their voices to the conversation, so the conversation itself is biased toward survivors.
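
A minimal Python sketch of the fund-analysis case (the return distribution and the survival cutoff are made-up illustration values): averaging only the funds that survive overstates performance because the failures silently drop out.

import random

random.seed(2)
all_funds = [random.gauss(0.0, 10.0) for _ in range(1_000)]   # annual return of every fund, in %
survivors = [r for r in all_funds if r > -5.0]                # assume the worst performers shut down

print("Average return, all funds:      %.2f%%" % (sum(all_funds) / len(all_funds)))
print("Average return, survivors only: %.2f%%" % (sum(survivors) / len(survivors)))
# The survivors-only average is noticeably higher, yet it is the figure usually reported.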

27
Q

Selection bias

A

Selection bias is the bias introduced by the selection of individuals, groups or data for analysis in such a way that proper randomization is not achieved, thereby ensuring that the sample obtained is not representative of the population intended to be analyzed.[1] It is sometimes referred to as the selection effect. The phrase “selection bias” most often refers to the distortion of a statistical analysis, resulting from the method of collecting samples. If the selection bias is not taken into account, then some conclusions of the study may be false.
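
A minimal Python sketch of the idea, using a made-up customer-satisfaction survey: when happier customers are more likely to respond, the self-selected sample overstates the true average even though every individual answer is accurate.

import random

random.seed(3)
population = [random.uniform(1, 10) for _ in range(100_000)]   # true satisfaction scores, 1-10

random_sample = random.sample(population, 500)                 # proper randomization
# Non-random selection: the chance of responding rises sharply with satisfaction.
self_selected = [s for s in population if random.random() < (s / 10) ** 2][:500]

print("True population mean:  %.2f" % (sum(population) / len(population)))
print("Random-sample mean:    %.2f" % (sum(random_sample) / len(random_sample)))
print("Self-selected mean:    %.2f" % (sum(self_selected) / len(self_selected)))
# The self-selected estimate runs well above the truth; randomization keeps the estimate honest.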