LECTURE 6 - ‘DEMOCRATIC BACKLASH OF THE DIGITAL REVOLUTION’ Flashcards

1
Q

Characteristics of data - Decades ago

A
  • Census data
  • Market research
  • Data about neighborhoods & buildings
  • Household level data
  • Individual level data
  • Unique identifiers
2
Q

Characteristics of data - Recent years

A
  • Aggregated anonymized “big data”
  • Metadata about websites, apps, content places
  • Browser or device specific data
  • Individual level data
  • Unique identifiers
3
Q

The most valuable resource: data?

A

Data → information → knowledge
- Data needs structure and context to create meaning
- Data without insights is worse than useless

4
Q

What are tracing, tracking and identification technologies, and how are these data collected?

A
  • Mass proliferation of data on consumer demographics, psychographics, behaviour and attitudes, including health and location data.
    → Collection through smartphones and the growth of internet enabled devices.
5
Q

What can be done with advanced data analysis?

A
  • It enables (political) marketing to transform ‘area’ or ‘platform-based’ campaigns to campaigns which target individual users across multiple devices and locations.
  • In other words: tracking → individual targeting across platforms
6
Q

What is the real data revolution?

A
  • Continuous extraction and monitoring of individuals
  • More insight into the approximate locations of individuals → reconstruct individuals’ movements across space and time.
  • Access to mobility data
7
Q

Human mobility traces are highly unique (de Montjoye et al., 2013) Results

A
  • Analysis of 15 months of human mobility data
  • Four spatio-temporal points are enough to uniquely identify 96% of the individuals
  • Human mobility traces are highly unique
  • The larger the temporal and spatial dimensions, the less unique the mobility traces
  • Uniqueness increases with higher-quality data points
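The unicity idea behind this result can be sketched computationally. The following is a minimal, hypothetical illustration (not the study's actual method or data; trace format and trial count are invented) of estimating how often k spatio-temporal points single out one individual among a set of traces:

```python
import random

def unicity(traces, k, trials=200, seed=0):
    """Estimate the fraction of users uniquely identified by k random
    spatio-temporal points drawn from their own trace.
    traces: dict of user_id -> set of (hour_slot, cell_id) tuples."""
    rng = random.Random(seed)
    users = list(traces)
    unique = 0
    for _ in range(trials):
        uid = rng.choice(users)
        pts = rng.sample(sorted(traces[uid]), k)
        # A user is "unique" if no other trace contains all k points.
        matches = [u for u in users if all(p in traces[u] for p in pts)]
        if matches == [uid]:
            unique += 1
    return unique / trials

# Toy traces: the two users share one point, otherwise move differently.
traces = {
    "a": {(9, "cell1"), (12, "cell2"), (18, "cell3")},
    "b": {(9, "cell1"), (13, "cell4"), (19, "cell5")},
}
print(unicity(traces, k=2))  # two points already separate these users
```

Coarsening the (hour_slot, cell_id) resolution merges points across users, which is why larger spatial and temporal bins make traces less unique.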
8
Q

Detailed audience segmentation

A
  • Allows the population to be divided into ever smaller groups based on granular information about demographics, psychographics, behaviours and attitudes.
  • Micro-targeting of messages tailored to personal characteristics, mindset, emotional state, lifestyle and other characteristics.
  • Even if these messages do not persuade people to change their minds, they may encourage or discourage them from voting.
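The segmentation step itself is mechanical: cross granular attributes to form ever smaller groups. A minimal, hypothetical sketch (attribute names and records are invented for illustration):

```python
from collections import defaultdict

def segment(voters, keys):
    """Group voter records into micro-segments keyed by a tuple of
    attribute values (e.g. age band x top issue x personality trait)."""
    segments = defaultdict(list)
    for v in voters:
        segments[tuple(v[k] for k in keys)].append(v["id"])
    return dict(segments)

# Toy voter file: every added attribute splits segments further.
voters = [
    {"id": 1, "age_band": "18-29", "top_issue": "climate", "neuroticism": "high"},
    {"id": 2, "age_band": "18-29", "top_issue": "climate", "neuroticism": "low"},
    {"id": 3, "age_band": "30-44", "top_issue": "economy", "neuroticism": "high"},
]
print(segment(voters, ["age_band", "top_issue", "neuroticism"]))
```

With enough attributes, each "segment" approaches a single individual, which is the micro-targeting endpoint described above.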
9
Q

At what level of an individual do you need to connect when campaigning?

A
  • The emotional level of the person
    → Politics is connecting to sentiments and motivations. Politics is conquering the hearts and minds.
10
Q

The Big Five Personality Traits model (1950s – 1990s)

A
  • Measures five key dimensions of people’s personalities:
  • Openness (Intellect or imagination): measures level of creativity and desire for knowledge and new experiences.
  • Conscientiousness: measures the level of care people take in life and work. High scores indicate you are organized and thorough and know how to make plans and follow them through. Low scores indicate you are likely to be lax and disorganized.
  • Extraversion/introversion: measures your level of sociability. Outgoing versus quiet. Some draw energy from a crowd, while others find it difficult to work and communicate with other people.
  • Agreeableness: measures how well you get on with other people. Considerate, helpful and willing to compromise versus rigid and inclined to put your own needs before others’.
  • Natural reactions (emotional stability or neuroticism): measures emotional reactions. Are you negative or calm in times of adversity? Are you obsessively worried about small details, or are you relaxed in stressful situations?
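Scoring such a trait questionnaire is mechanical: average the answers that load on each trait, reversing reverse-keyed items. A minimal sketch with a hypothetical 10-item mini-questionnaire (the items and keying here are invented for illustration, not a published instrument):

```python
# Hypothetical mini-questionnaire: each item maps to one Big Five trait;
# reverse-keyed items mean a high answer indicates a LOW trait score.
ITEMS = [
    ("openness", False), ("openness", True),
    ("conscientiousness", False), ("conscientiousness", True),
    ("extraversion", False), ("extraversion", True),
    ("agreeableness", False), ("agreeableness", True),
    ("neuroticism", False), ("neuroticism", True),
]

def score(answers, scale_max=5):
    """Average 1..scale_max answers into one score per Big Five trait."""
    totals, counts = {}, {}
    for (trait, reverse_keyed), a in zip(ITEMS, answers):
        v = (scale_max + 1 - a) if reverse_keyed else a
        totals[trait] = totals.get(trait, 0) + v
        counts[trait] = counts.get(trait, 0) + 1
    return {t: totals[t] / counts[t] for t in totals}

print(score([5, 1, 4, 2, 3, 3, 5, 1, 2, 4]))
```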
11
Q

Computational Propaganda (Risso, 2018)

A
  • The pace at which lies and fake news travel in the online world through different strategies and technologies, such as automated social-media bots.
  • → Manipulation of public opinion via a range of social networking platforms and devices (e.g. Cambridge Analytica).
  • Involves both algorithmic distribution and human curation → bots and trolls working together.
  • Psychographic databases allow companies like Cambridge Analytica to develop communication programmes and electoral campaigns that trigger inner fears and exploit deep-rooted biases.
12
Q

Bots, trolls and botnets

A

Bots and trolls working together.

  • Bot = A software program that performs automated, repetitive, pre-defined tasks: an automated social media account run by an algorithm rather than a real person, designed to make posts without human intervention.
  • Troll = An actual person who intentionally initiates online conflict or offends other users to distract and sow division, posting inflammatory content in an online community or social network in order to provoke an emotional response or derail discussions.
  • Botnet = A network of bot accounts managed by the same individual or group, designed to manufacture social media engagement so that a topic or person appears more heavily engaged with by “real” users than is actually the case.
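The botnet definition suggests a simple detection heuristic: coordinated accounts tend to post identical text within a short time window. A hedged sketch (the data shape, thresholds and function name are assumptions for illustration, not an established detection method):

```python
from collections import defaultdict

def flag_botnet_candidates(posts, min_accounts=3, window_s=60):
    """Flag accounts that post identical text within `window_s` seconds
    of at least `min_accounts - 1` others: a crude signal of a botnet
    manufacturing engagement. posts: list of (account, text, timestamp)."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((ts, account))
    flagged = set()
    for text, items in by_text.items():
        items.sort()  # order by timestamp
        for anchor_ts, _ in items:
            cluster = [a for t, a in items if abs(t - anchor_ts) <= window_s]
            if len(set(cluster)) >= min_accounts:
                flagged.update(cluster)
    return flagged

# Three accounts amplify one message in 20 seconds; one human posts normally.
posts = [
    ("acct1", "Vote for X!", 0),
    ("acct2", "Vote for X!", 10),
    ("acct3", "Vote for X!", 20),
    ("human", "nice weather today", 15),
]
print(flag_botnet_candidates(posts))  # flags the three coordinated accounts
```

Real platforms combine many more signals (account age, posting cadence, follower graphs); this only illustrates the "coordinated engagement" idea from the card.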
13
Q

effect on democracy and freedom

A

  • Traditional media lost its role as ‘gate-keeper’.
  • Agenda-setting and issue ownership
  • Fake news as satire, for profit, political propaganda, and reckless reporting.
  • Issue ownership = if voters associate a party with an issue, naming that party immediately brings the issue to mind, and vice versa.

14
Q

Campaign bots

A
  • Automatically generate messages that misinform voters, while the identity of the operator remains undisclosed (no accountability) and usually undiscoverable (shady outsiders and foreign actors).
  • The final breakdown of established campaigning.
15
Q

Comparing the accuracy of personality judgement of computer models vs. humans
What do they do?

A

People’s personalities can be predicted automatically and without involving human social-cognitive skills:
computers’ personality judgements based on digital footprints are more accurate and valid than judgements made by close acquaintances (friends, family, spouse, colleagues, etc.).

16
Q

Study compares the accuracy of personality judgement between computer models and humans

A

Using several criteria: computers’ judgements of people’s personalities based on their digital footprints are more accurate and valid than judgements made by their close others or acquaintances (friends, family, spouse, colleagues, etc.).
→ This highlights that people’s personalities can be predicted automatically and without involving human social-cognitive skills.

17
Q

Psychological targeting:

A

Makes it possible to influence the behavior of large groups of people by tailoring persuasive appeals to the psychological needs of the target audiences.

18
Q

Results of 3 studies on psychological targeting

A

→ Evidence for the effectiveness of psychological targeting in the context of real-life digital mass persuasion.
- Tailored persuasive appeals aligned with the psychological profiles of large groups of people can influence actual behavior and choices.
- Psychological mass persuasion can help people make better decisions and lead healthier and happier lives.
- It can also be used to covertly exploit weaknesses in character and persuade people to act against their own best interest, highlighting the potential need for policy interventions.
- Facebook already offers ad ‘optimization’, using its data to predict clicks on ads or other online actions, which already segments users along many more dimensions than models such as the Big Five.

19
Q

‘the internet of things’ (IoT)

A
  • Raw materials for big data platforms are supplied by expanding types of sensors and devices
  • Interconnected network of physical devices, vehicles, appliances, and other objects that are embedded with sensors, software, and network connectivity, enabling them to exchange data and communicate with each other over the internet.
20
Q

IoT technology

A
  • A network of ‘smart’, internet enabled sensors and devices, able to communicate with each other and collect data about their use.
21
Q

conflict between democracy and dictatorship

A

The conflict between democracy and dictatorship is NOT between two different ethical systems, but between
two different data-processing systems.
→ Democracy distributes the power to process information and make decisions among many people
and institutions, whereas dictatorship concentrates information and power in one place.

22
Q

Cyber-utopia versus cyber-dystopia
Morozov: the dark side of internet freedom
summary

A

To salvage the internet’s promise to aid the fight against authoritarianism, those of us in the West who still care about the future of democracy will need to ditch both cyber-utopianism and internet-centrism. Currently, we start with a flawed set of assumptions (cyber-utopianism) and act on them using a flawed, even crippled, methodology (internet-centrism). The result is what I call the Net Delusion. Pushed to its extreme, such logic is poised to have significant global consequences that may risk undermining the very project of promoting democracy. It’s a folly that the West could do without.

23
Q

Morozov’s point of view

A
  • Morozov argues that problems of democratic participation do not have a ‘technological fix’.
  • Morozov’s criticism is NOT that the Net is unimportant to democracy, nor a denial that the Net can enhance democracy and undermine authoritarianism, nor that the ‘dark forces’ always win.
  • The Net is not inherently democratic: preserving democratic freedoms and realizing the liberating potential of the Net will be difficult.
    → The current business structure of the Net makes it more useful to dictators in authoritarian regimes than to democratic dissidents and protesters.
24
Q

Why is the dark side of the Net so difficult to notice?
utilitarian perspective

A

The Net is merely a neutral tool and can be used for both good and bad

25
Q

Why is the dark side of the Net so difficult to notice?
ecological perspective

A

The Net is more than a tool → It transforms the political environment and the people who participate in politics. It enlarges or creates a new (digital/networked) public sphere

26
Q

How will authoritarians suppress the liberating and emancipatory capabilities of the Net, and develop new or enhance existing repressive capabilities?

A
  • Physical: mobs or fake protests
  • Pressure on social media and telephone companies to block services or keep servers in the country
  • Control of online resources / platform control
  • Cyber-attacks
  • Propaganda in digital forms
  • Active use of Twitter by authoritarian powerholders and pro-government forces
  • New forms of filtering and customized censorship
  • Delegating censorship to private companies
  • New forms of technological surveillance
  • Mobile tracking (identification of protesters through mobile phones)
  • Data-mining
  • Social graph analysis (pre-empting who will protest)
27
Q

Deep neural networks are more accurate than humans at detecting sexual orientation from facial images (Wang, Y. & Kosinski, M., 2017). Results

A
  • Study of 300,000 images downloaded from public profiles on a dating app
  • Using basic facial-detection technology, unique ‘faceprints’ were created
  • Correlations between facial features and sexuality were detected
  • The face analyser distinguished gay from straight faces better than humans did
  • Given 5 photos: 91% accuracy for men and 83% for women (81% and 71% from a single photo)
  • Human ability to tell gay from straight: 61% for men and 54% for women
28
Q

Trump campaign - Cambridge Analytica in the USA, 2016 (Persily, 2017)

A
  • 3 principal components of Trump’s digital campaign operation:
  • Marketing agency Giles-Parscale → designed the advertisements and website
  • Micro-targeting firm Cambridge Analytica → pro-Trump; harvested data on 87 million Facebook users to target them with political ads
  • The Republican Party’s digital team
  • Micro-targeting: 4,000 different campaign ads
29
Q

Aleksandr Kogan

A
  • Involved in the Cambridge Analytica scandal
  • Used academic data for commercial purposes
  • Created an app marketed as a personality quiz → users were asked to provide access to their Facebook data
  • The data was sold to Cambridge Analytica to create detailed psychological profiles of voters and target them with ads
30
Q

Computer-based personality judgements are more accurate than those made by humans (Youyou, Kosinski & Stillwell, 2015) results

A
  • 86,220 respondents completed a 100-item personality questionnaire
  • Computer predictions (based on Facebook likes) are more accurate than those made by humans
  • Computer models show higher inter-judge agreement (judges who agree with each other are more likely to be accurate)
  • Computer models have higher external validity
  • Overall: based on Facebook likes, an algorithm can predict personality traits better than people who know the person
  • The more likes available, the better the predictability
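The study’s accuracy criterion (agreement between Like-based predictions and questionnaire self-reports) can be sketched with synthetic data. This is an illustrative toy, not the paper’s model: the user-by-Like matrix, weights and noise level below are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_likes = 200, 50

# Invented data: binary user-by-Like matrix; the questionnaire trait
# score depends (noisily) on the first five Likes.
likes = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)
true_w = np.zeros(n_likes)
true_w[:5] = 1.0
trait = likes @ true_w + rng.normal(0, 0.5, n_users)

# Fit a linear model on half the users, predict the other half; the
# accuracy criterion is the correlation between prediction and self-report.
w, *_ = np.linalg.lstsq(likes[:100], trait[:100], rcond=None)
pred = likes[100:] @ w
r = np.corrcoef(pred, trait[100:])[0, 1]
print(round(r, 2))  # Likes carry enough signal to predict the trait
```

More observed Likes per user means more informative rows in the matrix, which is why predictability grows with the number of available Likes.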
31
Q

Cyber-utopian definition

A

Someone who believes that technology, particularly the internet and digital technologies, can solve many of the world’s problems and create a better future for humanity.

32
Q

Trump campaign - Twitter results

A
  • Bots produced about 20% of all tweets
  • Pro-Trump Twitter bots created four times as many tweets as pro-Clinton bots
  • Trump’s tweets were retweeted more than three times as often as Clinton’s, while his Facebook posts were re-shared five times more often
33
Q

Political ‘micro-targeting’

A

Profiling and segmenting voters so that political messages can be adapted to their psychological traits
