Lecture 4: social media and online hate Flashcards

1
Q

What could belong to online hate?

A
  • Attacks against groups, discrimination, prejudice
  • Toxic behaviours
  • Incivility
  • Cyberbullying, sexual harassment
  • Trolling, flaming, griefing
  • Social deviance
2
Q

Definition ‘cyberhate’

A

“The use of electronic technology to spread bigoted, discriminatory, terrorist and extremist information manifests itself on websites and blogs, as well as in chat rooms, social media, comment sections and gaming. In short, hate is present in many forms on the internet, creating a hostile environment and reducing equal access to its benefits for those targeted by hatred and intimidation”.

3
Q

How do people theorize online harm?

A
  • Rumors or harmful words offline (48%)
  • Unique to social media/internet (46%)
    - Enabled by public nature of social media
    - Privacy violation
  • Attitudes, beliefs and behaviours causing online harm
  • Algorithms as being responsible
  • Constant access to other people’s lives, leading to comparing oneself to others
  • Normalizing of harmful content
4
Q

Typology of harm

A
  • Physical: Bodily injury, including self-injury, sexual abuse or death
  • Emotional: from annoyance to traumatic emotional response
  • Relational: Damage to one’s reputation or relationships (personal, professional or community)
  • Financial: Material or financial loss (e.g., scams, extortion, account loss, blackmail, etc.)
5
Q

Typology of severity

A
  • Punitive approach: Judicial perspective, degrees of harm a person caused
  • Time-based approach: law enforcement (i.e. police) perspective, prioritizing “severity” and danger to victim
  • Persistence approach: mental health professional perspective, focus on the symptoms and the physical threat posed to the victim and others
6
Q

Dimensions of severity

A
  • Perspective (actor, viewer, target)
  • Perceived intent (low to high)
  • Agency of target (‘do you have a choice’)
  • Experience assessing harm (relating it to one’s own life)
  • Scale of harm (number of people impacted)
  • Urgency to address harm
  • Vulnerability
  • Medium
  • Sphere the harm occurs in
7
Q

How do we detect or measure online hate?

A
  • Interviews and focus groups
  • Survey questionnaires
  • Content analysis
8
Q

Computational / machine learning methods

A
  • Content-based - identifying offensive language
  • Sentiment and emotion-based - identifying emotional language
  • User or profile-based - identifying user demographics and user’s profile information
  • Network-based - identifying user’s association with other users
9
Q

Coping with online hate

A
  • Technical coping
  • Assertiveness (tell them to stop)
  • Close support (talk to friends)
  • Helplessness/self-blame (not knowing what to do)
  • Retaliation (revenge)
  • Distal advice (call police)
10
Q

Health consequences

A
  • Fear
  • Spiral of silence
  • Anxiety
  • Depression
  • PTSD
  • Sleep problems
  • Stress
11
Q

Professional consequences of online hate

A
  • Loss of productivity
  • Reputational damage
  • Loss of confidence in work
  • Stopping covering/research on extremist groups
  • Self-censorship
12
Q

Who do people consider to be perpetrators of online harm? (actors of dark participation)

A
  • Individuals (64%): groups, influencers or media
  • Social media platforms (20 students)
  • Society (16 students)
13
Q

What groups spread online hate?

A
  • Far right extremists
  • Conspiracy theorists
  • Terrorists
  • Misogynistic groups, aka incels
14
Q

Reasons for dark participation (Causal actors)

A
  • Mental health issues
  • Personality
  • Being bullied
  • From interpersonal conflicts or breakups
  • Boredom
  • Loneliness or isolation
  • Anonymity
  • Popularity or dominance
  • Accessibility
  • Lack of feedback from victim
15
Q

Dark personalities (socially undesirable)

A
  • Narcissism
  • Psychopathy
  • Machiavellianism: Cynical, cunning manipulators, exploitive of others
  • Sadism
16
Q

Moral disengagement

A
  • Social cognitive theory of morality by Albert Bandura
  • aka: justifying one’s own unethical or immoral behavior
  • Cognitive processes in reducing guilt and remorse
  • Rationalization to justify harm
    - Victim blame: “It’s their fault”
    - Dehumanization: “they deserve it”
    - Minimization: “it happens, what can you do?”
    - Diffusion of responsibility: “Not my problem”
17
Q

Definition moral grandstanding

A

Suggesting that individuals post potentially polarizing ideological statements not so much to persuade others of their position, but rather as a form of value-signaling that is ‘directly or indirectly concerned with increasing one’s influence, rank or social standing’.