Week 2 Flashcards

1
Q

Dark participation: 5 variants

A
  1. actors
  2. reasons
  3. objects/targets
  4. audiences
  5. process
2
Q

Social identity model of deindividuation effects

A
  • Factors anonymity into how people behave within a group
  • Personal identity is subsumed by a salient group identity
3
Q

social approval theory

A
  • Seeking approval from the in-group
  • Sense of belonging
4
Q

Typology of harm

A
  1. Physical: injury
  2. emotional: emotional response
  3. relational: damage to reputation/relationship
  4. financial: material loss
5
Q

typology of severity

A
  • punitive approach: degrees of harm a person caused
  • time-based approach: severity and danger to the victim
  • persistence approach: focus on symptoms and physical threat to victim
6
Q

computational/machine-learning-based detection methods

A
  1. content-based
  2. sentiment/emotion-based
  3. user/profile-based
  4. network-based (a minimal content-based sketch follows this list)
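
A minimal sketch of the content-based approach, assuming scikit-learn and a tiny invented toy dataset (the example texts, labels, and model choice are illustrative assumptions, not course material):

```python
# Content-based detection sketch: bag-of-words features + logistic regression
# trained on a tiny, invented toy dataset (not real data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "great point, thanks for sharing",
    "you are wonderful",
    "go back where you came from",
    "people like you should be silenced",
]
labels = [0, 0, 1, 1]  # 0 = benign, 1 = hateful (toy labels)

# TF-IDF turns each post into a sparse word-frequency vector;
# the classifier learns which terms are associated with hateful posts.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["thanks, that was helpful"]))  # likely [0] on this toy data
```

The other three methods would swap in different features (sentiment/emotion scores, account or profile metadata, social-graph measures) while keeping the same train-and-classify structure.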
7
Q

6 methods to cope with online hate

A
  1. Technical coping: “block the offender”, “screenshot evidence”
  2. Assertiveness: “tell the offender to stop”
  3. Close support: “talk to friends about it”
  4. Helplessness/self-blame: “not know what to do”
  5. Retaliation: “get back against the offender”
  6. Distal advice: “call the police”
8
Q

Health consequences of online hate

A
  • PTSD
  • fear
  • anxiety
  • spiral of silence
  • depression
  • stress
    - sleep problems
9
Q

professional consequences of online hate

A
  • loss of productivity
  • reputational damage
  • loss of confidence in work
  • ceasing to cover or research extremist groups
  • self-censorship
10
Q

4 dark personalities

A
  • Narcissism: high sense of entitlement
  • Psychopathy: disregard for and violation of others’ well-being; cold-hearted
  • Machiavellianism: cynical, cunning manipulators, exploitative of others
  • Sadism: motivated by and takes enjoyment in causing others harm
11
Q

social approval theory: critique from Walther

A

“Traditional approaches to online hate tend to draw on assumptions that have little to do with contemporary social media”
“Such approaches do not particularly emphasize the social nature of social media, including interactivity among its participants and the potential effect of a large audience on the perpetrators’ mindsets”

12
Q

2 automated content moderation methods

A
  1. matching: new content is matched against known, previously moderated content
  2. classification: assessing new content with no previous exact example; machine learning (see the sketch below)
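
A minimal sketch of how the two methods can fit together, assuming exact hash matching against previously removed content and a stand-in classifier (the hash set, the ToyClassifier, and the decision labels are illustrative assumptions, not any platform's actual pipeline; real systems typically use perceptual or fuzzy matching rather than exact hashes):

```python
# Illustrative moderation pipeline: try matching first, fall back to classification.
import hashlib

class ToyClassifier:
    """Stand-in for a trained model (e.g. the content-based sketch above)."""
    def predict(self, posts):
        return [1 if "silenced" in p else 0 for p in posts]

# 1. Matching: hashes of content that moderators already removed.
known_bad_hashes = {hashlib.sha256(b"previously removed post").hexdigest()}

def moderate(post: str, classifier=ToyClassifier()) -> str:
    if hashlib.sha256(post.encode()).hexdigest() in known_bad_hashes:
        return "removed (matched known content)"
    # 2. Classification: no prior exact example, so use a learned model instead.
    if classifier.predict([post])[0] == 1:
        return "flagged for review (classified as likely violating)"
    return "allowed"

print(moderate("previously removed post"))             # removed via matching
print(moderate("people like you should be silenced"))  # flagged via classification
print(moderate("nice weather today"))                  # allowed
```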
13
Q

Approaches of content moderation

A
  1. artisanal approach: small-scale platforms; case-by-case decisions
  2. community-reliant approach: site-wide governance by formal policies with community input; volunteer moderators
  3. industrial approach: large scale; formal policies; loss of context; automated moderation
14
Q

deplatforming hate results

A
  • Haters leave the platform but migrate elsewhere
  • Those who stayed on Reddit expressed less hate speech after the ban
  • Communication around deplatformed celebrities is reduced
  • The spread of offensive ideas associated with those celebrities is reduced
  • The activity and toxicity of their supporters are reduced
15
Q

The Big Five personality traits model (1950–1990)

A
  1. openness
  2. conscientiousness
  3. extraversion/introversion
  4. agreeableness
  5. neuroticism (natural reactions)
16
Q

Troll definition

A

a real person who intentionally initiates online conflict or offends other users by posting inflammatory content in an online community or social network, in order to distract, sow division, provoke others into an emotional response, or derail discussions

17
Q

Bot definition

A

an automated social media account run by an algorithm, rather than a real person, designed to make posts without human intervention

18
Q

Botnet definition

A

a network of bot accounts managed by the same individual or group, designed to manufacture social media engagement so that a topic or person appears more heavily engaged with by ‘real’ users than is actually the case

19
Q

Youyou, Kosinski & Stillwell (2015): Computer-based personality judgements are more accurate than those made by humans
findings

A
  • Computer predictions based on a generic digital footprint are more accurate than those made by the participants’ Facebook friends
  • Computer models show higher inter-judge agreement
  • Computer personality judgements have higher external validity when predicting life outcomes (see the sketch below)
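
A minimal sketch of what a computer-based personality judgement can look like, assuming a hypothetical binary user-by-Like matrix and self-reported trait scores (the random data, the LASSO model, and the correlation metric are illustrative assumptions, not the study's actual pipeline):

```python
# Illustrative sketch: predict a Big Five trait from a user x Like matrix
# with regularized regression, then score accuracy as a correlation.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
likes = rng.integers(0, 2, size=(200, 50))  # 200 users x 50 Likes (hypothetical)
# Toy "self-reported" extraversion that partly depends on the first few Likes.
extraversion = likes[:, :5].sum(axis=1) + rng.normal(0, 1, 200)

predicted = cross_val_predict(Lasso(alpha=0.1), likes, extraversion, cv=5)

# "Accuracy" in the study's sense: the correlation between the model's
# prediction and the self-report, which can then be compared with the
# corresponding correlation for a friend's judgement.
print(np.corrcoef(predicted, extraversion)[0, 1])
```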
20
Q

The ‘internet of things’ definition

A

the expanding range of sensors and devices that supplies the raw material for big data platforms

21
Q

IoT technology definition

A

a network of ‘smart’, internet-enabled sensors and devices, able to communicate with each other and collect data about their use

22
Q

Risso, L. (2018): computational propaganda definition

A

The way in which the public can be manipulated on social networking applications.