Political Science Flashcards

1
Q

Content moderation

A

The organized practice of screening user-generated content posted to internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction

2
Q

Network society

A

A social structure heavily influenced by interconnected digital networks, such as the internet, shaping communication, economy, and social interactions. An information-driven economy characterized by the compression of time and space into a “space of flows”

3
Q

Knowledge labor

A

Jobs that rely heavily on intellectual skills rather than manual labour

4
Q

Immaterial knowledge labour

A

Bridges direct and indirect knowledge work. Work that produces intangible results, often in the form of information, ideas, or digital content

5
Q

“Duty of Care”

A

A new legal mechanism in TERREG. Positions social media platforms as benevolent security actors: they are pushed to commit themselves to protecting the integrity of their services in a way that is aligned with (or at least not opposed to) the priorities of public authorities.

6
Q

Co-production

A

(Public-private collaboration). Public authorities and private platforms work together to address challenges such as online terrorist content. The concept emphasizes the complex relationship and interaction between public and private actors in the production of security decisions, such as the referral, removal, flagging, and filtering of online content. Co-production does not imply seamless collaboration; rather, the concept can help map exactly how data are shared, stored, and removed.

7
Q

Referral

A

The process whereby public authorities, such as the EU Internet Referral Unit (IRU), report content that violates platforms’ community guidelines to the platforms for removal. It allows public authorities to intervene in content moderation by influencing how companies handle content or by pushing platforms to remove content brought to their attention. One of four key components of EU-directed content moderation.

8
Q

Removal

A

The process of taking down or disabling access to online content that is deemed to violate platforms’ terms of service or national legislation. It involves the actual elimination of the content from the platform. One of four key components of EU-directed content moderation.

9
Q

Flagging

A

The use of large-scale algorithmic systems and human reviewers to identify digital objects that may contain terrorist or other inappropriate content. Once flagged, the content is prioritized for review by human content moderators. This process is crucial in identifying and addressing potentially harmful content. One of four key components of EU-directed content moderation.

10
Q

Filtering

A

The use of automated tools to identify and remove digital objects that violate platforms’ terms of service. It involves creating mechanisms to prevent the publication of previously removed content, or of content deemed potentially terrorist-related or inappropriate by machine learning algorithms. One of four key components of EU-directed content moderation.

11
Q

European security integration

A

The process of coordinating and aligning security practices and policies across European countries. It involves collaboration and cooperation between European Union member states to address security challenges collectively

12
Q

Public-private cooperation

A

The collaboration and interaction between public authorities and private entities, such as social media platforms, in addressing security challenges. It involves joint efforts and shared responsibilities in security-related activities.

13
Q

Networked security

A

The interconnected and collaborative approach to security, where various actors, including public authorities, private companies, and international organizations, work together to address security threats.

14
Q

The role of platforms in European security

A

Platforms, such as social media and online service providers, play a significant role in European security by being responsible for content moderation, implementing security measures, and collaborating with public authorities to address security challenges.

15
Q

Private Terms of Service (ToS)

A

The agreements set by social media companies that serve as the basis for their content moderation decisions. These agreements have quasi-legal force and are often referred to as community guidelines.

16
Q

National Legislation

A

The laws and regulations of individual countries that may provide the legal basis for ordering platforms to remove online content. National legislation can deem online content illegal irrespective of the platforms’ terms of service.

17
Q

Human-Machine Interaction

A

The collaboration between human reviewers and machine learning systems in the process of content moderation. It involves the use of technology, such as algorithms and automated tools, alongside human expertise to make decisions on the identification and removal of online content

18
Q

Trusted flagger

A

Reliable users designated as trusted flaggers; when they report content, platforms act on the report more quickly

19
Q

Shadowbanning

A

Content remains uploaded (it is not taken down), but certain messages are made practically invisible: downgraded to such an extent that they might as well not be there

20
Q

A new ‘Digital Constitutionalism’

A

Platforms use the language and forms of constitutionalism in their user agreements and Terms of Service; private companies are themselves making institutions. For example, platforms like Facebook have written their terms in the style of a constitution.