Lecture 8 Flashcards
ethical persuasion concerns
- manipulation
- coercive/predatory
- misinformation
- intrusiveness
- microtargeting
potential persuasion issues
failure to communicate novel, important information
marketing and advertising issues
intrusive, disruptive TV, radio, online ads, deceptive advertising, targeting specific groups
political persuasion issues
- propaganda
- mass media
- polarization
- disinformation
health/safety persuasion concerns
- disinformation
- failure to communicate important information
how easily are people persuaded?
- gullibility: accept too much, too easily persuaded
- conservatism: reject too much information, fail to update
Mercier (2017)
strong gullibility?
- widespread and frequent
- costly (damaging rituals, costly purchases, harmful inaction, risky action, etc.)
- due to sources (authority figures vs content of messages)
- people evolved to resist being taken advantage of in communication
- especially hard to persuade toward costly behavior or counter-intuitive arguments
- people accept content that fits with prior views more than deference to source
- as environments and technologies change, new susceptibilities may emerge
psychological mechanisms of persuasion
- plausibility checking
- trust calibration
- reasoning
plausibility checking
using prior beliefs to interpret new information or messages, only believing/incorporating if plausible
- inconsistency between new experienced info and prior beliefs
-> update
- inconsistency between message from another person and prior beliefs -> evaluate source trustworthiness and reason -> updating
- more extreme/new information -> more thorough checking
- risk of too much conservatism? failure to update on non-intuitive information, which is then harder to spread
trust calibration
evaluate trustworthiness via 1) cues of competence and benevolence of source, 2) commitment tracking
- more willing to accept implausible information from a competent, benevolent source, or from a majority
- competence: intelligence, expertise, direct experience
- benevolence: care for others' interests, shared interests, affiliation, similarity
commitment tracking
calibrate trust according to source confidence and reliability
reasoning
evaluate argument strength
- (recall ELM): if involved, relevant, high stakes, etc., people evaluate arguments thoroughly via the central route/high elaboration
- if low relevance, unimportant, low stakes, people take the peripheral route/low elaboration and rely on cues
lapses of judgement leading to gullibility
- people buy things they don’t need/want/etc.
- some people do enter misinformation rabbit holes or join cults
- costliness to self matters, but costliness to others may not trigger a strong reaction (e.g. government cuts to foreign aid greatly affect others' well-being, but not one's own)
- leads to over-conservatism and failure to incorporate new, accurate information in the future
why resist?
- accuracy motives: desire to have correct information, avoid deception
- defense motives: self-consistency, reduce conflict, reluctance to change
- freedom motives: reactance to threat to freedom
- social motives: cohesion with in-group
accuracy motives
desire to have correct information, avoid deception
- desire to maintain own beliefs as correct/truthful (may lead to confirmation bias)
- previous negative experiences with persuasion (ads) increases skepticism for all ads
- knowledge of persuasion strategies may trigger skepticism
defense motives
- defense motives: self-consistency, reduce conflict, reluctance to change
- desire to maintain important, self-relevant beliefs
- perceive more risks than benefits
- satisfaction with current situation
freedom motives
reactance to threats to freedom
- persuasion threatens freedom to: display attitude or behavior, change attitude or behavior, avoid committing to a position
- perception of threat to freedom can occur if persuasive intent is perceived, even for messages that are not counter-attitudinal or that serve one's own well-being; also with forceful, intensive, assertive, or direct requests, or guilt-inducing messages
- may behave or shift attitude to contradict the persuasive message (boomerang effect)
strategies to resist
- avoidance strategies: selective processing (exposure and memory)
- biased processing strategies: weighting, reducing impact, optimism bias
- contesting strategies: counterarguing, derogation of source/content/persuasive attempt
- empowerment strategies: attitude bolstering, social validation, self-assertion
avoidance
avoid persuasion altogether
- physical: e.g. leaving room during ads, avoid sales representatives
- mechanical: use tool to avoid e.g. switch channel, fast-forward, ad-blockers
- cognitive: ignore, reduce attention, e.g. avoid looking at banner ads
- more likely to avoid informational than emotional/entertaining messages
- happens before persuasion attempt
avoidance: selective exposure
avoid information that contradicts prior attitudes/beliefs and seek information that is aligned
- avoid conflict, avoid cognitive dissonance, confirmation bias
avoidance: selective memory
remember certain information and not others
- usually self-consistent, fluent, accessible, attitude-bolstering information
- forget counter-attitudinal information
biased processing
- weight attributes: put more weight on attitude-consistent information, less on inconsistent information, to support original attitude
- reduce impact: reduce relevance of message to self by isolating counter-attitudinal information to avoid impact on broader attitude
- use knowledge: high knowledge -> stronger attitude -> more resistant to persuasion, better counter-arguing
- optimism bias: assume negative outcomes will not happen to self
contesting
- challenge or derogate the message
- source: undermine credibility
- content: undermine credibility, relevance, precision (exaggeration)
- persuasive intent/strategy: undermine message based on bad intent
contesting, derogating the message source
derogating the message source: question credibility, expertise, trustworthiness of source
- less effort than counterarguing; based on cues rather than arguments
- similar to Mercier’s ‘trust calibration’
contesting, derogating the message content
derogating the message content
- dismiss content as exaggeration, not credible, or irrelevant to self
- can be problematic in health/safety domains
counter arguing
- generate arguments as to why message is incorrect, strengthened by forewarning/inoculation
inoculation
exposing people to a weakened version of an argument against their beliefs can help them develop counterarguments and resist future persuasion
- similar to how vaccines work by exposing the body to a weakened virus to build immunity
contesting, derogating the message persuasive intent
derogating the persuasive intent/strategies:
- knowledge of persuasion strategies, such as emotional appeals or the use of cute/attractive images, can trigger resistance
- forewarning
- inoculation
- less common for narratives than for messages with clear persuasive intent
empowerment
attitude bolstering: retrieve attitude and generate pro-attitude reasons before exposure
- makes attitude more accessible, potentially more coherent/consistent
- does not directly counter contradictory messages
- self assertion
- social validation
self assertion
reaffirm self-esteem and self-confidence in one's attitudes
- reduce susceptibility to social pressure to conform
social validation
confirm attitude by thinking about others who share that attitude
- use feedback from others to strengthen attitude
which strategies are triggered by which motives?
when to use different strategies
misinformation vs disinformation
misinformation: any incorrect information
disinformation: incorrect information with intent to mislead
cognitive drivers of misinformation
- familiarity: fluency, plausibility, coherence with prior beliefs increase acceptance of misinformation
- intuition: more time to think or having to justify choices can override intuition and reduce susceptibility
socio-affective drivers
- source cues: credibility, attractiveness, etc.
- emotion: emotional cues, harm, moral outrage, anger, or happy mood
- worldview: ideology, political leaning, and group membership matter
cognitive barriers to correction of misinformation
- memories cannot be erased/overridden
- corrections must be integrated with original misinformation in memory so both get retrieved
- misinformation may be retrieved with or without the correction, whereas the correction is only activated alongside the misinformation, making it relatively weaker
socio-affective barriers to correction of misinformation
- source cues matter for misinformation and correction
- emotional misinformation may create stronger memories
- correction that threatens worldview may backfire (defense motives)
policy interventions of misinformation
- fact checking: flag inaccuracy for specific misinformation with wide reach, harmful consequences (but unflagged info may be perceived as more likely to be true)
- shift algorithm to reduce virality of misinformation (but false positives and false negatives)
psychological interventions of misinformation
- logic corrections: address general logical fallacies in misinformation
- debunking: correct specific misinformation after exposure, explain why false, and offer alternative explanation
- pre-bunking: warning of potential misinformation, pre-emptive correction
misinformation, when to refute
- pre-bunk: indication of accuracy in advance
- labeling: indication of accuracy with info
- debunk: indication of accuracy after seeing info
Brashier et al., 2021
- participants rate false headlines as false more often in the debunk and label groups
- participants rate true headlines as true more often in the debunk group, but all pre-bunk, label, and debunk outperform control
- debunking may work better for specific facts
inoculation theory
- analogous to medical vaccine, but for misinformation
- can target broader spectrum of issues, generalize, whereas other factual corrections are often content-specific
bad news game
players act as disinformation creators and learn different methods or strategies of disinformation
- strategies taught: impersonation, emotion, polarization, conspiracy, discredit, trolling
Roozenbeek et al., 2022
- participants rate misinformation as less reliable after playing bad news game
- participants rate real news as less reliable after playing
- participants rate misinformation as even less reliable than real news (more discernment)
inoculation theory applications
applications to misinformation domains
- health (COVID)
- climate change
- conspiracy theories
persistence: effect decays over months, but ~3-month 'booster' shots with additional training show stable effects