Misinformation Flashcards
misinformation
incorrect beliefs held confidently
Explain misinformation and the Iraq War
E.g. people confidently believed (falsely) that Iraq had WMDs (weapons of mass destruction), influencing support for the Iraq War
However, Saddam Hussein didn't have the money or resources to do this
This belief was used as justification for starting the war and fueled rage, even after it was proven incorrect (directional motivations)
2 motivational causes for misinformation
- directional motivations
- accuracy motivations
directional motivations
people seek information aligning with their political identity or preexisting views
includes confirmation bias
Confirmation Bias
the tendency to look for information that supports our preexisting beliefs and discard conflicting information
accuracy motivations
people genuinely seek correct information
relationship between directional and accuracy motivations
Directional motivations often overpower accuracy motivations
Backfire Effect
Corrections can backfire, causing people to cling more strongly to their initial false beliefs
backfire effect evolutionary explanation
we perceive corrections as attacks on our tribe, so people push back in defense to reinforce tribal connections
cognitive dissonance
the mental discomfort of holding two conflicting beliefs, which can cause people to find ways to reinforce their existing beliefs
Factors affecting correction effectiveness
- issue type
- source of correction
Issue type
-Issues deeply tied to identity or partisan worldview are harder to correct
Correcting misinformation on the highly partisan issue of immigration is more challenging
-Less politicized or non-central issues are easier to correct
E.g. if you don't care strongly about Blake Lively and find out that attacks on her were planned, your views of her may change as more information is revealed
Source of correction
Corrections from sources with unexpected or non-partisan credibility are most effective
E.g. Republicans calling out misinformation in Trump’s State of the Union speech
Scholars and journalists are increasingly being attacked because they try to correct misinformation, but their corrections are interpreted as attacks on the other side
measurement challenges
difficulty distinguishing between sincere misinformation and expressive responding (intentionally reporting false beliefs to express partisan identity)
E.g. Trump supporters inaccurately claimed that his inauguration had a larger crowd than Obama's, despite clear photographic evidence
Samuel Woolley “manufacturing consensus”
-uses computational communication strategies to analyze how misinformation spreads
-The chapter focuses on a case study of India, examining how Prime Minister Narendra Modi's party spread information that supported it at the expense of truth
computational propaganda
Use of automated digital tools to manipulate public opinion and manufacture consensus
E.g. producers at the Super Bowl trying to regain the audience by amplifying wild and enthusiastic cheering during Michael Jackson's halftime performance to create the illusion of mass consensus and support for him
misinformation and disinformation thrive in closed communication systems facilitated by political operatives using:
-automation (bots)
-anonymous profiles (e.g. for amazon reviews)
-real human influencers
How did the BJP employ computational propaganda
BJP operatives used WhatsApp groups and political bots to amplify pro-Modi messaging and drive racial and partisan animosity that led to lynchings
Biased political messages were made to appear organically popular, using digital technologies to create the illusion of consensus
relational advertising
leverages personal trust relationships (family/friends) to amplify propaganda
people trust friends and family more
How did BJP utilize relational advertising?
WhatsApp groups facilitate misinformation spread because users trust information from close contacts more than traditional media
Chaos as a strategic tool
Propagandists intentionally generate confusion rather than clear narratives, as confusion itself serves their political objectives
how did the BJP utilize chaos as a strategic tool
BJP WhatsApp campaigns created conflicting narratives which led to
-Uncertainty about who's right and wrong
-Mistrust in traditional media
-Weakened critical capacities among the public (don’t have time to investigate thoroughly so people rely on those they trust)
manufacturing consensus meaning
Creating the illusion of widespread public support or opposition through artificial digital means
E.g. in the 2016 US election, Trump's campaign benefited significantly from automated bots artificially inflating his online popularity on Twitter - people start to believe information that they see very often
Manufacturing consent: five filters explaining how elite interests manipulate mass media to control public opinion
- ownership
- advertising
- sourcing
- flak
- ideological fear
ownership
Most major media outlets are owned by large corporations that are part of the economic elite - outlets avoid content that challenges the owners or threatens profits or corporate power
ownership influences the stories that do or don’t run
advertising
Media outlets depend on advertising to survive
The real customers are the advertisers, and to keep them happy the media has to avoid or soften stories that offend advertisers
E.g., a pharmaceutical company advertises in your paper, but a study shows its drug causes major side effects - this story may not be run
sourcing
News organizations have to rely on official sources, which control access to large amounts of information
Flak
-Negative responses to media content
-Powerful groups can generate flak when media publishes stories that are harmful to them
-Fear of backlash can cause self-censorship and lack of discussion on controversial topics (e.g. with anti-vaccine propagandists)
Ideological fear
By promoting fear, media can justify the actions of those in power - critics can then be painted as disloyal
E.g. anti-communism, anti-terrorism, anti-immigration
Digital Astroturfing
Creating the illusion of grassroots organizations and movements, using digital tools such as bots and sockpuppets
“Distraction, not Partisanship Drives Sharing of Misinformation” Reading
-examines why misinformation spreads online
-finds that misinformation spreads not so much because people can't recognize false info, but because of distraction
-People can typically identify misinformation accurately but often share it anyway due to inattention rather than strong partisan motives
what does ideology impact and not impact?
-Ideology only modestly influences individuals’ judgments of accuracy
-Ideology significantly influences the decision to share content (people share headlines that align with their political affiliation)
disconnect between accuracy and behavior
Participants acknowledged the importance of sharing accurate info but still shared misinformation
-stated beliefs differ from actual behaviors
cognitive prompts
simple interventions reminding users to evaluate accuracy
-found to significantly reduce sharing misinformation
E.g. stating at the beginning of an article that it is seven years old
“People share outrageous news even when they know it’s false” reading findings
People can often tell that information is false but will share it anyway if it triggers moral outrage
Experiments showed participants could distinguish true from false news but would still share false news if emotionally aroused
Social functions of outrage
-Sharing outrageous misinformation helps individuals signal moral or social alignment
-Emotional arousal provides social rewards - status, identity confirmation