week 2 Flashcards

1
Q

definition cyberhate

A

the use of electronic technology to spread bigoted, discriminatory, terrorist and extremist information, manifested on websites and blogs.
- It creates a hostile environment and reduces equal access to the internet's benefits for those targeted by hatred and intimidation.

2
Q

the 5 parts of dark participation

A
  1. actors
  2. reasons
  3. objects/targets
  4. audiences
  5. processes
3
Q

Actors of dark participation

A

individuals
groups
influencers
media

4
Q

reasons of dark participation

A

authentic evil: attacks out of personal hate or for pleasure (aimed at an individual)
tactical: more controlled and planned
strategic: large-scale manipulation campaigns

physical appearance
political views
gender

5
Q

Objects/targets of dark participation

A

religious groups
minority ethnic groups
gender groups
political groups
mostly 18-39 years old

6
Q
processes of dark participation
A

unstructured
structured (situational)
systematic (long term)

7
Q

social approval theory (not from Walther, but the slides)

A
  • seeking approval from ingroup members
  • collaboration with ingroup members
  • sense of belonging
  • the audience of online hate is the ingroup, not the targeted victims
8
Q

4 types of harm

A
  1. physical: bodily injury, sexual abuse
  2. emotional: from annoyance to traumatic emotional response
  3. relational: damage to one’s reputation or relationships
  4. financial: material or financial loss
9
Q

typology of severity

A
  1. punitive approach: judicial perspective: degree of harm a peer caused
  2. time-based approach: law enforcement perspective: prioritizing by severity and danger to the victim
  3. persistence approach: mental health professional perspective: focus on the symptoms and the physical threat posed to the victim and others
10
Q

dimensions of severity

A
  1. perspective (actor, viewer, target)
  2. perceived intent (low-high)
  3. agency of target (do you have a choice)
  4. experience assessing harm (relating from own life)
  5. scale of harm (number of people impacted)
  6. urgency to address harm (time sensitivity)
  7. vulnerability (risk posed to certain groups)
  8. medium (live video, audio, image)
  9. sphere the harm occurs in (degree of privacy of the harm)

mnemonic: ppaesuvms

11
Q

how to cope with online hate

A
  1. technical coping (block the offender)
  2. assertiveness (tell the offender to stop)
  3. close support (talk to friends)
  4. helplessness (not knowing what to do)
  5. retaliation (getting back at the offender)
  6. distal advice (e.g. calling the police)
12
Q

health vs professional consequences of online harm

A

Health consequences: fear, spiral of silence, anxiety, depression, stress…
Professional consequences: loss of productivity, reputational damage, loss of confidence, stopping coverage, self-censorship

13
Q

Walther on social approval theory

A

People generate hate messages online primarily to accrue signals of admiration and praise from sympathetic online peers and to make friends

14
Q

hypotheses and results of the article by Frischlich et al.: how personality, media use and online experience shape uncivil participation

A

RQ: do dark personality traits, political attitudes and emotions, media use, and users' experiences (civil or uncivil) lead to one's own uncivil behavior?

Results: 46% of those who witnessed incivility also engaged in uncivil participation
- exposure to uncivil comments and hate speech strongly predicted uncivil participation
- strongest predictor: personal experience with online victimization
- RQ confirmed (overall)

15
Q

definition dark participation

A

It is characterized by negative, selfish or even deeply sinister contributions to the news-making processes (such as “trolling”)

16
Q

what solutions do people see for online hate

A
  1. Solving it is up to the individual (54%)
    - by choosing what content to post or consume
  2. Social media companies should fix it (36%)
    - using algorithmic filters
    - human moderators
    - tracking or blocking harmful content
17
Q

risks of removing anonymity

A
  1. victims of online harassment know who their attacker is (+)
  2. conflict, harassment and discrimination are social and cultural problems
  3. revealing info exposes people to greater levels of harassment (-)
  4. companies storing personal info for other purposes has risks (-)
  5. identity protection needed (-)
  6. people manage identity across different social contexts
  7. social norms reduce problems
  8. people reveal identity to increase influence and approval
  9. hate groups operate openly to seek legitimacy (-)

mnemonic: vcrcipsph

18
Q

differences between automated content moderation and human moderation

A

automated content moderation:
- removes problematic content without direct, consistent human intervention (runs automatically)
- example: ChatGPT
human moderation:
- moderation of tweets by humans
- leads to dissatisfaction & emotional exhaustion

19
Q

2 types of automated content moderation

A
  1. matching: new content is matched against known, already-moderated content (e.g. via hashing)
  2. classification: assessing new content with no previous examples (machine learning)
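The difference between the two can be sketched in a toy example (illustrative only: the hash set and the keyword rule are invented stand-ins for real fingerprint databases and trained classifiers):

```python
import hashlib

# Matching: fingerprint new content and look it up in a database of
# fingerprints of content that was already judged unacceptable.
banned_hashes = {hashlib.sha256(b"known banned post").hexdigest()}

def matches_banned(text: str) -> bool:
    return hashlib.sha256(text.encode()).hexdigest() in banned_hashes

# Classification: score content never seen before. A trained machine
# learning model would go here; a keyword rule stands in for it.
FLAGGED_TERMS = {"vermin", "scum"}

def classify_uncivil(text: str) -> bool:
    return any(term in text.lower() for term in FLAGGED_TERMS)

print(matches_banned("known banned post"))   # exact copy is caught
print(matches_banned("Known banned post!"))  # any edit evades matching
print(classify_uncivil("they are VERMIN"))   # unseen text still flagged
```

Matching is cheap and precise but brittle (any edit changes the fingerprint); classification generalizes to new content at the cost of errors, which is why platforms combine both.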
20
Q

definition content moderation

A

the detection of, assessment of, and interventions taken in content or behavior deemed unacceptable by platforms or other info intermediaries
- including the rules they impose, the human labour and technologies required, and the institutional mechanisms of adjudication, enforcement and appeal that support it

21
Q

3 types of approaches to content moderation and their explanation

A
  1. Artisanal approach (small platforms)
    - case-by-case governance
    - limited use of automated content moderation
    - greater time available on moderation per report
  2. Community-reliant approach (Wikipedia, Reddit)
    - federated system of governance
    - site-wide governance by formal policies with community (user) input
    - relationships between volunteer moderators and platform administrators (balance of power)
  3. Industrial approach (Facebook, Google, Twitter)
    - formal policies created by a policy team
    - mass moderation operations (highly consistent enforcement, loss of context)
    - heavy use of automated content moderation
22
Q

safe space vs. freedom of speech

A

I don't fully understand this ;) 2 examples with experiments

23
Q

certain type of power-shift of media over the last decades

A

mainstream media and political institutions lost most of their power → unmediated populist nationalism for the internet age

24
Q

explanation “knowledge, info and data triangle”

A
  1. knowledge: important, but there's little of it (experience, insight, understanding and contextualized info)
  2. info: in the middle (contextualized, categorized, calculated and condensed data)
  3. data: a lot of it, easy to get (facts and figures which relay something specific, but which are not organized in any way)
25
Q

how do you connect knowledge to data

A

people are data (phones, online purchases, smart TVs) → mass proliferation of data on consumer demographics, psychographics, behavior etc. → useful for advertising
- extraction and monitoring are continuous
- high uniqueness of human mobility traces (keeps increasing with high-quality data points)
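The "high uniqueness of human mobility traces" can be made concrete with a toy sketch (all data invented; the real studies use millions of phone-record traces): even when single points are shared between people, a small handful of (location, hour) points already matches only one person.

```python
import random

# Invented toy dataset: user -> set of (location, hour) observations.
traces = {
    "alice": {("mall", 9), ("office", 10), ("gym", 18)},
    "bob":   {("mall", 9), ("office", 11), ("bar", 22)},
    "carol": {("park", 9), ("office", 10), ("gym", 19)},
}

def is_unique(user: str, k: int) -> bool:
    """True if k random points from this user's trace fit nobody else."""
    sample = set(random.sample(sorted(traces[user]), k))
    candidates = [u for u, pts in traces.items() if sample <= pts]
    return candidates == [user]

# Single points are ambiguous (("mall", 9) fits alice and bob), yet
# two spatio-temporal points already single out every toy user.
print(all(is_unique(u, 2) for u in traces))  # True
```

More and higher-quality data points per person only make re-identification easier, which is the point the card makes.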

26
Q

how does detailed audience segmentation work?

A

advanced techniques of audience segmentation allow the population to be divided into ever smaller groups on the basis of granular info about demographics, psychographics, behavior and attitudes
- you're at the VU → a highly educated student
- connecting this with motivations, personal characteristics, mindsets etc.

27
Q

Persily on political (Trump) campaigns (3 components of Trump's digital campaign operation)

A

The Trump campaign started late and focused mainly on social media, yet live rallies made the campaign feel authentic

3 principal components of Trump's digital campaign operation:
1. marketing agency Giles-Parscale
2. microtargeting by the firm Cambridge Analytica
- they developed detailed psychological profiles of their target audience
- more data points → higher predictive power of its algorithms
- very innovative in combining the largest collection of data points in the world with incredibly accurate psychometric profiles of users
3. the Republican Party's digital team

28
Q

big five personality traits

A

measures five key dimensions of people’s personalities
1. openness
2. conscientiousness
3. extraversion/introversion
4. agreeableness
5. neuroticism
–> all rated from low to high

29
Q

definition computational propaganda

A
  • manipulation of public opinion via range of social networking platforms and devices
  • Involves both algorithmic distribution and human curation
  • bots and trolls working together
30
Q

definition bot & troll & botnet

A

bot = an automated social media account run by an algorithm, designed to make posts without human intervention

troll = an actual person who intentionally initiates online conflict or offends other users to distract and sow division, posting inflammatory messages in an online community or social network in order to provoke others into an emotional response or derail discussion

botnet = a network of bot accounts managed by the same individual or group

31
Q

findings of the article by Youyou et al.: computer-based personality judgements are more accurate than those made by humans

A

Findings:
- computer predictions based on a generic digital footprint are more accurate than those made by participants' Facebook friends
- computer models show higher interjudge agreement
- computers have higher external validity when predicting life outcomes, even outperforming some self-rated personality scores
- high scores on the big five personality traits → easier to predict

Conclusion: computers outpacing humans in personality judgement presents significant opportunities and challenges in psychological assessment, marketing, politics and privacy

32
Q

results experiment extraversion and openness with ads

A

introverted audience → introverted ad works best, low-openness audience → low-openness ad works best (the same for extraversion and high openness)
- so: people like ads that fit well with their personality
- psychological mass persuasion could help people make better decisions and lead healthier and happier lives

33
Q

Morozov on the problems of democratic participation

A

we need to ditch cyber-utopianism and internet-centrism: we start with a flawed set of assumptions and act on them using a flawed methodology. The result is called the Net Delusion, so extreme that it is poised to have global consequences that risk undermining the very project of promoting democracy.

34
Q

why is the dark side of the net so difficult to notice?

A

utilitarian perspective: the Net is merely a neutral tool and can be used for both good and bad

ecological perspective: the Net is more than a tool: it transforms the political environment and the people who participate in politics. it enlarges or creates a new public sphere

35
Q

How do authoritarians apply the capabilities of the Net and develop new repressive capabilities?

A
  • physical: mobs or fake protests
  • pressure social media and telephone companies to block service or keep servers in the country
  • control of online resources/platform control
  • cyber-attacks
  • propaganda in digital forms
  • active use of twitter by authoritarian power-holders and pro-government forces
  • new forms of filtering and customized censorship
  • delegating censorship to private companies
  • use of new forms of technological surveillance (monitoring, mobile tracking, data-mining, social graph analysis)

mnemonic: ppccp andu