Misinformation Flashcards

1
Q

misinformation

A

incorrect beliefs held confidently

2
Q

Explain misinformation in the context of the Iraq War

A

E.g. people confidently (and falsely) believed that Iraq had weapons of mass destruction (WMDs), which influenced support for the Iraq War

However, Saddam Hussein did not have the money or resources to develop such weapons
The claim was used as justification for starting the war and continued to fuel rage even after it was proven incorrect (directional motivations)

3
Q

Two motivational causes of misinformation

A
  1. directional motivations
  2. accuracy motivations
4
Q

directional motivations

A

people seek information aligning with their political identity or preexisting views

includes confirmation bias

5
Q

Confirmation Bias

A

the tendency to look for information that supports our preexisting beliefs and discard conflicting information

6
Q

accuracy motivations

A

people genuinely seek correct information

7
Q

relationship between directional and accuracy motivations

A

Directional motivations often overpower accuracy motivations

8
Q

Backfire Effect

A

Corrections can backfire, causing people to cling more strongly to their initial false beliefs

9
Q

backfire effect evolutionary explanation

A

we perceive corrections as an attack on our tribe, so we push back in defense to reinforce our tribal connection

10
Q

cognitive dissonance

A

the mental discomfort of holding two beliefs that cannot both be true, which can cause people to find ways to reinforce their existing beliefs

11
Q

Factors affecting correction effectiveness

A
  1. issue type
  2. source of correction
12
Q

Issue type

A

-Issues deeply tied to identity or partisan worldview are harder to correct
E.g. correcting misinformation on the highly partisan issue of immigration is more challenging

-Less politicized, non-central issues are easier to correct
E.g. if you found out that the attacks on Blake Lively were planned and you don't care strongly about her, your views of her may change as more information is revealed

13
Q

Source of correction

A

Corrections from sources with unexpected or non-partisan credibility are most effective
E.g. Republicans calling out misinformation in Trump's State of the Union speech

Scholars and journalists are increasingly attacked when they try to correct misinformation, because their corrections are interpreted as attacks on the other side

14
Q

measurement challenges

A

difficulty distinguishing between sincere misinformation and expressive responding (intentionally reporting false beliefs to express partisan identity)

E.g. Trump supporters inaccurately claimed that his inauguration had a larger crowd than Obama's despite clear photographic evidence

15
Q

Samuel Woolley “manufacturing consensus”

A

-uses computational communication strategies to analyze how misinformation spreads
-chapter focuses on a case study of India, examining how Prime Minister Narendra Modi's party promoted information that supported it at the expense of truth

16
Q

computational propaganda

A

Use of automated digital tools to manipulate public opinion and manufacture consensus

E.g. Super Bowl producers trying to regain the audience by amplifying wild, enthusiastic cheering during Michael Jackson's halftime performance to create the illusion of mass consensus and support for him

17
Q

misinformation and disinformation thrive in closed communication systems facilitated by political operatives using:

A

-automation (bots)
-anonymous profiles (e.g. for Amazon reviews)
-real human influencers

18
Q

How did the BJP employ computational propaganda?

A

BJP operatives used WhatsApp groups and political bots to amplify pro-Modi messaging and drive racial and partisan animosity that led to lynchings

Biased political messages were made to appear organically popular, using digital technologies to create the illusion of consensus

19
Q

relational advertising

A

leverages personal trust relationships (family/friends) to amplify propaganda

people trust friends and family more

20
Q

How did BJP utilize relational advertising?

A

WhatsApp groups facilitate misinformation spread because users trust information from close contacts more than traditional media

21
Q

Chaos as a strategic tool

A

Propagandists intentionally generate confusion rather than clear narratives, as confusion itself serves their political objectives

22
Q

How did the BJP utilize chaos as a strategic tool?

A

BJP WhatsApp campaigns created conflicting narratives, which led to
-Uncertainty about who's right and wrong
-Mistrust in traditional media
-Weakened critical capacities among the public (people don't have time to investigate thoroughly, so they rely on those they trust)

23
Q

manufacturing consensus meaning

A

Creating the illusion of widespread public support or opposition through artificial digital means

E.g. in the 2016 US election, Trump's campaign benefited significantly from automated bots artificially inflating his online popularity on Twitter; people start to believe information that they see very often

24
Q

Manufacturing consent: five filters explaining how elite interests manipulate mass media to control public opinion

A
  1. ownership
  2. advertising
  3. sourcing
  4. flak
  5. ideological fear
25
Q

ownership

A

Most major media outlets are owned by large corporations that are part of the economic elite; these companies avoid content that challenges the owners or threatens profits or corporate power

Ownership influences which stories do or don't run

26
Q

advertising

A

Media outlets depend on advertising to survive
The real customers are the advertisers, and to keep them happy the media must avoid or soften stories that offend them

E.g. a pharmaceutical company advertises in your paper, but a study finds that its drug causes major side effects; this story may not be run

27
Q

sourcing

A

News organizations have to rely on official sources, which hold a great deal of information

28
Q

Flak

A

-Negative responses to media content
-Powerful groups can generate flak when media publishes stories that are harmful to them
-Fear of backlash can cause self-censorship and lack of discussion on controversial topics (e.g. with anti-vaccine propagandists)

29
Q

Ideological fear

A

By promoting fear, media can justify the actions of those in power - critics can then be painted as disloyal

E.g. anti-communism, anti-terrorism, anti-immigration

30
Q

Digital Astroturfing

A

Creating the illusion of grassroots organizations and movements, using digital tools such as bots and sockpuppets

31
Q

“Distraction, not Partisanship Drives Sharing of Misinformation” Reading

A

-examines why misinformation spreads online
-finds that misinformation is spread less by an inability to recognize false info than by distraction
-People can typically identify misinformation accurately but often share it anyway due to inattention rather than strong partisan motives

32
Q

What does ideology impact and not impact?

A

-Ideology only modestly influences individuals' judgments of accuracy
-Ideology significantly influences the decision to share content (headlines that align with one's political affiliation)

33
Q

disconnect between accuracy and behavior

A

Participants acknowledged the importance of sharing accurate info but still shared misinformation

-stated beliefs differ from actual behaviors

34
Q

cognitive prompts

A

simple interventions reminding users to evaluate accuracy

-found to significantly reduce the sharing of misinformation

e.g. saying at the beginning of an article that it is seven years old

35
Q

“People share outrageous news even when they know it’s false” reading findings

A

People can often tell that information is false but will share it anyway if it triggers moral outrage

Experiments showed participants could distinguish true from false news but would still share false news if emotionally aroused

36
Q

Social functions of outrage

A

-Sharing outrageous misinformation helps individuals signal moral or social alignment
-Emotional arousal provides social rewards - status, identity confirmation