week 2 Flashcards
definition cyberhate
the use of electronic technology to spread bigoted, discriminatory, terrorist and extremist information, manifested on websites and blogs.
- It creates a hostile environment and reduces equal access to the technology's benefits for those targeted by hatred and intimidation.
the 5 parts of dark participation
- actors
- reasons
- objects/targets
- audiences
- processes
Actors of dark participation
individuals
groups
influencers
media
reasons for dark participation
authentic evil: attacks driven by personal hate or pleasure (aimed at individuals)
tactical: more controlled and planned
strategic: large scale manipulation campaigns
Objects/targets of dark participation
- physical appearance
- political views
- gender
- religious groups
- minority ethnic groups
- gender groups
- political groups
- victims are mostly 18-39 years old
processes of dark participation
unstructured
structured (situational)
systematic (long term)
social approval theory (not from Walther, but the slides)
- seeking approval from ingroup members
- collaboration with ingroup members
- sense of belonging
- the audience of online hate is the ingroup, not the targeted victims
4 types of harm
- physical: bodily injury, sexual abuse
- emotional: from annoyance to traumatic emotional response
- relational: damage to one’s reputation or relationships
- financial: material or financial loss
typology of severity
- punitive approach: judicial perspective: degree of harm the perpetrator caused
- time-based approach: law enforcement perspective: prioritizing by severity and danger to the victim
- persistence approach: mental health professional perspective: focus on the symptoms and the physical threat posed to the victim and others
dimensions of severity
- perspective (actor, viewer, target)
- perceived intent (low-high)
- agency of target (do you have a choice)
- experience assessing harm (relating from own life)
- scale of harm (number of people impacted)
- urgency to address harm (time sensitivity)
- vulnerability (risk posed to certain groups)
- medium (live video, audio, image)
- sphere the harm occurs in (degree of privacy of the harm)
mnemonic: ppaesuvms
how to cope with online hate
- technical coping (block the offender)
- assertiveness (tell the offender to stop)
- close support (talk to friends)
- helplessness (doing nothing, not knowing what to do)
- retaliation (get back at the offender)
- distal advice (seek outside help, e.g. call the police)
health vs professional consequences of online harm
Health consequences: fear, spiral of silence, anxiety, depression, stress…
Professional consequences: loss of productivity, reputational damage, loss of confidence, stopping coverage of certain topics, self-censorship
Walther on social approval theory
People generate hate messages online primarily to accrue signals of admiration and praise from sympathetic online peers and to make friends
hypotheses and results of Frischlich et al.: how personality, media use and online experience shape uncivil participation
RQ: do dark personality traits, political attitudes and emotions, media use, and users' (civil or uncivil) experiences predict one's own uncivil behavior?
Results: 46% of those who witnessed incivility also engaged in uncivil participation
- witnessing many civil comments as well as hate speech strongly predicted uncivil participation
- strongest predictor: personal experience with online victimization
- RQ confirmed (overall)
definition dark participation
It is characterized by negative, selfish or even deeply sinister contributions to the news-making processes (such as “trolling”)
what solutions do people see for online hate
- solving it is up to the individual (54%)
  - deciding what to post or consume
- social media companies should fix it (36%)
  - using algorithmic filters
  - human moderators
  - tracking or blocking harmful content
risks of removing anonymity
- victims of online harassment know who their attacker is (+)
- conflict, harassment and discrimination are social and cultural problems
- revealing info exposes people to greater levels of harassment (-)
- companies storing personal info for other purposes has risks (-)
- identity protection is needed (-)
- people manage identity across different social contexts
- social norms reduce problems
- people reveal identity to increase influence and approval
- hate groups operate openly to seek legitimacy (-)
mnemonic: vcrcipsph
differences between automated content moderation and human moderation
automated content moderation:
- removes problematic content without direct, consistent human intervention (runs automatically)
- example: chatGPT
human moderation
- moderation of tweets by humans
- can lead to dissatisfaction and emotional exhaustion among moderators
2 types of automated content moderation
- matching: new content is compared against known, previously seen content
- classification: assessing new content without previous identical examples (machine learning)
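A minimal sketch of both types, assuming made-up content and weights: the SHA-256 lookup and the keyword score are simplified stand-ins for real perceptual-hash databases and trained classifiers.

```python
import hashlib

# Matching: compare new content against known, previously flagged content.
# (Real systems use perceptual hashes such as PhotoDNA; a plain SHA-256
# digest only catches exact duplicates, which keeps the sketch simple.)
KNOWN_BAD_HASHES = {hashlib.sha256(b"previously flagged post").hexdigest()}

def matches_known_content(text: str) -> bool:
    return hashlib.sha256(text.encode()).hexdigest() in KNOWN_BAD_HASHES

# Classification: score content never seen before. A keyword-weighted sum
# stands in here for a real machine-learning classifier.
WEIGHTS = {"hate": 0.8, "attack": 0.5}

def classify_as_harmful(text: str, threshold: float = 0.5) -> bool:
    score = sum(w for word, w in WEIGHTS.items() if word in text.lower())
    return score >= threshold  # True = flag for review/removal

print(matches_known_content("previously flagged post"))  # True: exact match
print(classify_as_harmful("a brand-new hate post"))      # True: no prior example needed
```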
definition content moderation
the detection of, assessment of, and interventions taken in content or behavior deemed unacceptable by platforms or other info intermediaries
- including the rules they impose, the human labour and technologies required, and the institutional mechanisms of adjudication, enforcement and appeal that support it
3 types of approaches to content moderation and their explanation
- Artisanal approach (small platforms)
  - case-by-case governance
  - limited use of automated content moderation
  - greater time available for moderation per report
- Community-reliant approach (Wikipedia, Reddit)
  - federal system of governance
  - site-wide governance by formal policies with community (user) input
  - relationship between volunteer moderators and platform administrators (balance of power)
- Industrial approach (Facebook, Google, Twitter)
  - formal policies created by a policy team
  - mass moderation operations (highly consistent enforcement, loss of context)
  - high use of automated content moderation
safe space vs. freedom of speech
(not entirely clear to me ;) 2 examples with experiments)
the power shift in media over the last decades
mainstream media and political institutions lose most of their power → unmediated populist nationalism for the internet age
explanation “knowledge, info and data triangle”
- knowledge: important, but there is little of it (experience, insight, understanding and contextualized info)
- info: in the middle (contextualized, categorized, calculated and condensed data)
- data: plentiful and easy to get (facts and figures which relay something specific, but which are not organized in any way)
how do you connect knowledge to data
people are data (phones, online purchases, smart TVs) → mass proliferation of data on consumer demographics, psychographics, behavior etc. → useful for advertising
- extraction and monitoring are continuous
- human mobility traces are highly unique (and uniqueness keeps increasing with more high-quality data points); see the sketch below
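A toy sketch of that uniqueness claim, using entirely synthetic traces and made-up sizes: it counts how many simulated users are the only one whose trace contains a handful of observed (place, hour) points.

```python
import random

# Synthetic mobility traces: each user visits TRACE_LEN random (place, hour)
# points. We then check how many users are uniquely identified by K points
# sampled from their own trace.
random.seed(0)
N_USERS, TRACE_LEN, N_PLACES, K = 1000, 50, 200, 4

traces = [{(random.randrange(N_PLACES), random.randrange(24))
           for _ in range(TRACE_LEN)} for _ in range(N_USERS)]

unique = 0
for trace in traces:
    known_points = set(random.sample(sorted(trace), K))       # K observed points
    matches = sum(known_points <= other for other in traces)  # subset test
    unique += (matches == 1)

print(f"{unique / N_USERS:.0%} of users pinned down by just {K} points")
```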
how does detailed audience segmentation work?
advanced audience-segmentation techniques allow the population to be divided into ever smaller groups on the basis of granular info about demographics, psychographics, behavior and attitudes (see the sketch after this list)
- you're at the VU → highly educated student
- connecting this with motivations, personal characteristics, mindsets etc.
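A minimal sketch of such segmentation, with made-up users and attribute names; every extra attribute splits the audience into smaller segments that can each receive a tailored message.

```python
from collections import defaultdict

# Hypothetical user records combining demographics and psychographics.
users = [
    {"age": 22, "education": "university", "interest": "politics"},
    {"age": 22, "education": "university", "interest": "sports"},
    {"age": 45, "education": "vocational", "interest": "politics"},
]

def segment_key(user: dict) -> tuple:
    # The more attributes go into the key, the more granular the segments.
    return ("18-39" if user["age"] < 40 else "40+",
            user["education"], user["interest"])

segments = defaultdict(list)
for u in users:
    segments[segment_key(u)].append(u)

for key, members in segments.items():
    print(key, "->", len(members), "user(s)")
```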
Persily on political (Trump) campaigns (3 components of Trump's digital campaign operation)
The Trump campaign started late and focused mainly on social media, yet live rallies made the campaign feel authentic
3 principal components of Trump's digital campaign operation
1. marketing agency Giles-Parscale
2. microtargeting by the firm Cambridge Analytica
- they developed detailed psychological profiles of their target audience
- more data points → higher predictive power of its algorithms
- very innovative: combined the largest collection of data points in the world with incredibly accurate psychometric profiles of users
3. the Republican Party's digital team
big five personality traits
measures five key dimensions of people’s personalities
1. openness
2. conscientiousness
3. extraversion/introversion
4. agreeableness
5. neuroticism
→ all rated from low to high
definition computational propaganda
- manipulation of public opinion via a range of social networking platforms and devices
- Involves both algorithmic distribution and human curation
- bots and trolls working together
definition bot & troll & botnet
bot = automated social media account run by an algorithm, designed to make posts without human intervention
troll = an actual person who intentionally initiates online conflict or offends other users, posting inflammatory content in an online community or social network to provoke an emotional response, sow division or derail the discussion
botnet = network of bot accounts managed by the same individual or group
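A minimal sketch of the bot idea, with a made-up account name and a print statement standing in for a real platform API; a botnet would simply run many such accounts under one operator.

```python
import time

# Prewritten messages the bot cycles through.
MESSAGES = ["Candidate X is the only choice!", "Don't trust the polls!"]

def publish(account: str, text: str) -> None:
    print(f"[{account}] {text}")  # placeholder for a platform API request

def run_bot(account: str, n_posts: int = 4, interval_s: float = 1.0) -> None:
    for i in range(n_posts):
        publish(account, MESSAGES[i % len(MESSAGES)])
        time.sleep(interval_s)  # fixed cadence: no human intervention

run_bot("bot_account_01")
```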
findings article youyou: computer based personality judgements are more accurate than those made by humans
Findings:
- computer predictions based on a generic digital footprint are more accurate than those made by the participants' Facebook friends
- computer models show higher interjudge agreement
- computers have higher external validity when predicting life outcomes even outperforming some self-rated personality score
- high scores on the big five personality traits → easier to predict
Conclusion: computers outpacing humans in personality judgement presents significant opportunities and challenges in psychological assessment, marketing, politics and privacy
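A rough sketch of the general approach, not the paper's exact pipeline: regress a self-reported trait score on a binary digital footprint, with synthetic data in place of real Facebook Likes.

```python
import numpy as np

# Synthetic data: 500 users, 100 binary "likes", one Big Five trait.
rng = np.random.default_rng(42)
n_users, n_likes = 500, 100
likes = rng.integers(0, 2, size=(n_users, n_likes))              # user x like matrix
true_w = rng.normal(size=n_likes)
openness = likes @ true_w + rng.normal(scale=2.0, size=n_users)  # noisy self-report

# Fit a linear model on 400 users, predict the remaining 100.
train, test = slice(0, 400), slice(400, None)
w, *_ = np.linalg.lstsq(likes[train], openness[train], rcond=None)
pred = likes[test] @ w

# Accuracy as the correlation between predicted and self-rated scores,
# the same metric the study used to compare computers with human judges.
r = np.corrcoef(pred, openness[test])[0, 1]
print(f"predicted vs. self-rated openness: r = {r:.2f}")
```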
results experiment extraversion and openness with ads
introverted ads → introverted audience, low-openness ads → low-openness audience (the same holds for extraversion and high openness)
- so: people like ads that fit well with their personality
- psychological mass persuasion can help people make better decisions and lead healthier and happier lives
morozov about the problems of democratic participation
we need to ditch cyber-utopianism and internet-centrism; we start with a flawed set of assumptions and act on them using a flawed methodology. The result is the Net Delusion, so extreme that it is poised to have global consequences that risk undermining the very project of promoting democracy.
why is the dark side of the net so difficult to notice?
utilitarian perspective: the Net is merely a neutral tool and can be used for both good and bad
ecological perspective: the Net is more than a tool: it transforms the political environment and the people who participate in politics. It enlarges or creates a new public sphere
How do authoritarians use the capabilities of the Net and develop new repressive capabilities?
- physical: mobs or fake protests
- pressure social media and telephone companies to block service or keep servers in the country
- control of online resources/platform control
- cyber-attacks
- propaganda in digital forms
- active use of twitter by authoritarian power-holders and pro-government forces
- new forms of filtering and customized censorship
- delegating censorship to private companies
- use of new forms of technological surveillance (monitoring, mobile tracking, data-mining, social graph analysis)
mnemonic: ppccp andu