Lecture 11 Flashcards

1
Q

Article: Grinberg et al. (2019, Science) Twitter Data

A

1% of individuals accounted for 80% of fake news source exposures; 0.1% of individuals accounted for nearly 80% of the fake news sources shared (see the concentration sketch below)

More likely to be exposed & share:
Republicans
Older people
Politically interested users

Facebook Data: Guess et al. (2019)
Only 8.5% shared stories from fake news sites
A few users account for most of the sharing

More likely to share:
Republicans
Older people
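A minimal sketch of how a concentration statistic like "1% of users account for 80% of exposures" can be computed. The distribution and its parameters are hypothetical, chosen only to mimic a heavy-tailed pattern, not the paper's data.

```python
import numpy as np

# Hypothetical per-user exposure counts: a heavy-tailed distribution,
# consistent with "a few users account for most of the sharing".
# The Pareto shape parameter is illustrative only.
rng = np.random.default_rng(0)
exposures = rng.pareto(1.1, size=100_000)

# Share of all exposures accounted for by the top 1% of users.
sorted_desc = np.sort(exposures)[::-1]
top_1pct = sorted_desc[: len(sorted_desc) // 100]
share = top_1pct.sum() / exposures.sum()
print(f"Top 1% of users account for {share:.0%} of all exposures")
```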

2
Q

Can false or unsubstantiated beliefs about politics be corrected?

A

Article: 4 experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction → investigating the extent to which corrective information embedded in realistic news reports succeeds in reducing prominent misperceptions about contemporary politics.

3
Q

Misperceptions

A

“…are cases in which people’s beliefs about factual matters are not supported by clear evidence and expert opinion — a definition that includes both false and unsubstantiated beliefs about the world.”

4
Q

Motivated reasoning

A

prevents people from updating their false beliefs. Citizens are likely to resist or reject arguments and evidence contradicting their opinions.

5
Q

Different components of motivated reasoning:

A

Prior Attitude Effect: perceiving evidence and arguments that support your views as stronger and more persuasive than those that challenge them.

Disconfirmation bias: people exert effort to counter-argue vigorously against evidence that is not congruent with their beliefs.

Confirmation bias/Selective exposure: searching for and consuming congenial information.

6
Q

Backfire Effect

A

a cognitive defense mechanism/strategy: “…a possible result of the process by which people counter-argue preference-incongruent information and bolster their preexisting views.”

7
Q

Research Design:

A

4 experiments

Subjects read news stories on 3 controversial issues and were split into 2 groups:

Group A: misleading claim from a politician

Group B: misleading claim + a correction

Compared how strongly Group B believed the misleading claim relative to Group A (see the analysis sketch below)
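A minimal sketch of the difference-in-means comparison this two-group design implies. The data, response scale, and variable names are hypothetical, not the article's actual materials.

```python
import numpy as np
from scipy import stats

# Hypothetical belief ratings (1-5 scale) in the misleading claim.
# Group A read only the claim; Group B read the claim plus a correction.
group_a = np.array([4, 5, 3, 4, 4, 5, 3, 4])  # claim only
group_b = np.array([3, 4, 2, 3, 5, 4, 2, 3])  # claim + correction

# The correction's estimated effect is the difference in mean belief.
effect = group_b.mean() - group_a.mean()

# A two-sample t-test checks whether the difference is statistically reliable.
t_stat, p_value = stats.ttest_ind(group_b, group_a)
print(f"Correction effect on belief: {effect:+.2f} (p = {p_value:.3f})")

# A negative effect means the correction reduced belief; a positive effect
# within a subgroup is what the article calls a "backfire effect".
```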

8
Q

Results:

A

Experiment 1: Iraq War (WMD)
Correction worked for liberals
Correction backfired for conservatives

Experiment 2: Iraq War (WMD)
Opposite result for conservatives: they did correct their WMD misperception
Backfired for conservatives who saw Iraq as an important issue

Experiment 3: Tax Cuts
Correction backfired for conservatives

Experiment 4: Stem Cell Research
Correction didn't work for liberals

Corrections frequently fail to reduce misperceptions among the targeted ideological group.

“Backfire effect”: corrections actually increase misperceptions among the group in question.

Corrections actually strengthened misperceptions among the most strongly committed subjects

→ Counter-attitudinal corrections seem to do poorly.

Is there really a backfire effect?

  • Not really there, even in extreme conditions

Humans are goal-directed information processors who tend to evaluate information with a directional bias toward reinforcing their pre-existing views.

→ Specifically, people tend to display bias in evaluating political arguments and evidence, favoring those that reinforce their existing views and disparaging those that contradict their views.

9
Q

Conclusions

A

The article explains why factual misperceptions about politics are so persistent:

Corrective information in news reports may fail to reduce misperceptions and can sometimes increase them for the ideological group most likely to hold those misperceptions

Direct factual contradictions can actually strengthen ideologically grounded factual beliefs

10
Q

Article: Pennycook et al.

A

Fact-checkers don’t have the bandwidth to check all news stories shared on social media
Not realistic to generate a “True” or “False” tag for all stories.

Potential issue: Implied Truth Effect → people may believe that untagged fake news stories are true.

11
Q

Article: Pennycook et al. Are fake-news warnings/tags a good solution?

A

2 experiments, in which subjects:
Read news stories (some were true and some were false)
Rated whether they were true/false

Experiment 1
2 groups:
Control: news stories had no warnings
Treatment:
1/2 of the false stories had a warning
The remaining stories (1/2 of the false, and all the true) had no warning
Result: untagged false headlines were believed more → Implied Truth Effect (see the sketch below)

Experiment 2
3 groups:
Control: news stories had no warnings
Warning treatment:
3/4 of the false stories had a warning
The remaining stories (1/4 of the false, and all the true) had no warning
Warning + Verification treatment:
3/4 of the false stories had a “false” warning
3/4 of the true stories had a “true” warning

Best condition: TRUE and FALSE warnings had the lowest sharing intentions
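A minimal sketch of the key Implied Truth Effect comparison, under assumed toy data; the column names, values, and coding are hypothetical, not the paper's dataset.

```python
import pandas as pd

# Hypothetical trial-level data: one row per (subject, headline) pair,
# recording the condition, the headline's actual veracity, whether it
# carried a "false" warning tag, and the stated sharing intention (0/1).
df = pd.DataFrame({
    "condition": ["control", "control", "control",
                  "treatment", "treatment", "treatment", "treatment"],
    "veracity":  ["false", "false", "true", "false", "false", "true", "true"],
    "tagged":    [False, False, False, True, False, False, False],
    "share":     [1, 0, 1, 0, 1, 1, 0],
})

# The comparison that matters: untagged FALSE headlines in the treatment
# condition vs. false headlines in the control condition (nothing tagged).
treat_untagged_false = df[(df.condition == "treatment")
                          & (df.veracity == "false") & ~df.tagged]["share"].mean()
control_false = df[(df.condition == "control") & (df.veracity == "false")]["share"].mean()

# A positive gap means the absence of a tag reads as implicit endorsement:
# untagged false headlines fare better once some stories carry tags.
print(f"Implied Truth Effect estimate: {treat_untagged_false - control_false:+.2f}")
```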

12
Q

Article: Pennycook & Rand (2020) Crowdsourcing fact-checking

A

Fact-checkers are overloaded

Warnings/tags can work, but we need “false” as well as “true” tags to avoid the Implied Truth Effect

Can we speed up fact-checking by leveraging the “wisdom of the crowd”?

13
Q

Article: Pennycook & Rand (2020) Crowdsourcing fact-checking: Research design

A

Recruited 2,000 participants

They were asked to rate how much they trust:

20 mainstream media domains (e.g. NYTimes, FoxNews)

22 hyperpartisan sites (e.g. Breitbart, DailyKos)

18 fake news sites (e.g. onepoliticalplaza)

Recruited 8 people working for fact-checking agencies and asked them to provide ratings for the same sites

4 main challenges may undermine the crowdsourcing approach:

  1. Average person may not do a good job at assessing which news sites are trustworthy
  2. Motivated reasoning (“wisdom” vs. “collective bias” of the crowd)
  3. Conservatives may do worse: higher cognitive rigidity
  4. Average person may not be familiar with many outlets
14
Q
  1. Average person may not do a good job at assessing which news sites are trustworthy
A

In the aggregate, the ratings by regular people and by fact-checkers correlated at over 90% (see the sketch below)

All mainstream media outlets were rated higher than hyperpartisan and fake news sites.
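A minimal sketch of what such an aggregate correlation looks like; the outlet-level numbers below are hypothetical, not the study's ratings.

```python
import numpy as np

# Hypothetical mean trust ratings for the same set of outlets:
# one value per outlet, averaged across the ~2,000 laypeople and
# across the 8 professional fact-checkers respectively.
crowd_means   = np.array([4.1, 3.8, 3.9, 1.9, 1.6, 1.2, 1.1])
checker_means = np.array([4.5, 4.0, 4.2, 1.5, 1.3, 1.0, 1.1])

# The ">90%" claim is a correlation computed at the outlet level:
# crowd aggregate vs. fact-checker aggregate.
r = np.corrcoef(crowd_means, checker_means)[0, 1]
print(f"Crowd vs. fact-checker correlation: r = {r:.2f}")
```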

15
Q
  2. Motivated reasoning (“Wisdom” vs. “Collective bias” of the crowd)
A

On average, people rated mainstream media from the opposing side higher than pro-attitudinal hyperpartisan/fake-news sites

16
Q
  3. Conservatives may do worse: higher cognitive rigidity
A

Republicans trust mainstream media less than Democrats do (with the exception of FoxNews)

Correlations with ratings from professional fact-checkers:

Democrats: 92%; Republicans: 73%

17
Q
  4. Average person may not be familiar with many outlets
A

Overall, participants had low familiarity with hyperpartisan and fake-news sites.

The correlation with fact-checkers was lower when ratings from people unfamiliar with a source were excluded.

This suggests that people need to be familiar with an outlet in order to trust it.