Political Science Flashcards
Content moderation
The organized practice of screening user-generated content posted to internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction
Network society
A social structure heavily influenced by interconnected digital networks, such as the internet, shaping communication, economy, and social interactions. An information-driven economy characterized by the compression of time and space into a “space of flows”
Knowledge labor
Jobs that rely primarily on intellectual skills rather than manual labor
Immaterial knowledge labor
Bridges direct and indirect knowledge work. Work that produces intangible results, often in the form of information, ideas, or digital content
“Duty of Care”
A new legal mechanism in TERREG (the EU regulation on addressing the dissemination of terrorist content online). It positions social media platforms as benevolent security actors: they are pushed to commit to protecting the integrity of their services in a way that is aligned with (or at least not opposed to) the priorities of public authorities.
Co-production
(Public-private collaboration). Public authorities and private platforms work together to address challenges such as online terrorist content. It emphasizes the complex relationship and interaction between public and private actors in the production of security decisions, such as the referral, removal, flagging, and filtering of online content. Co-production does not imply seamless collaboration. The concept of co-production can help map exactly how data are shared, stored, and removed.
Referral
The process whereby public authorities, such as the EU Internet Referral Unit (IRU), report content that violates platforms’ community guidelines to the platforms for removal. It allows public authorities to intervene in content moderation by influencing how companies handle content or by pushing platforms to remove content brought to their attention. One of four key components of EU-directed content moderation.
Removal
The process of taking down or disabling access to online content that is deemed to be in violation of platforms’ terms of service or national legislation. It involves the actual elimination of the content from the platform. One of four key components of EU-directed content moderation.
Flagging
Involves the use of large-scale algorithmic systems and human reviewers to identify digital objects that may contain terrorist or otherwise inappropriate content. Once flagged, the content is prioritized for review by human content moderators. This process is crucial in identifying and addressing potentially harmful content. One of four key components of EU-directed content moderation.
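A minimal Python sketch of this two-stage pipeline, assuming a hypothetical classifier score and review queue; all names here are illustrative, not any platform’s actual system:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedItem:
    # Negated classifier score, so higher scores pop first from the min-heap.
    priority: float
    content_id: str = field(compare=False)

def flag_for_review(items, classify, threshold=0.5):
    """Route items whose model score exceeds a threshold into a
    priority queue for human moderators; others pass through unflagged."""
    queue: list[FlaggedItem] = []
    for content_id, text in items:
        score = classify(text)  # hypothetical ML model score in [0, 1]
        if score >= threshold:
            heapq.heappush(queue, FlaggedItem(-score, content_id))
    return queue

# Toy stand-in for a trained classifier.
def toy_classifier(text: str) -> float:
    return 0.9 if "attack" in text.lower() else 0.1

queue = flag_for_review(
    [("post-1", "Holiday photos"), ("post-2", "Planning an attack")],
    toy_classifier,
)
while queue:
    item = heapq.heappop(queue)
    print(f"Send {item.content_id} to human review (score={-item.priority:.2f})")
```

The point of the sketch is the division of labor the flashcard describes: the algorithm only triages and ranks; the actual moderation decision is left to the human reviewers who drain the queue.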
Filtering
The use of automated tools to identify and remove digital objects that violate platforms’ terms of service. It involves mechanisms that prevent the publication of previously removed content, or of content deemed potentially terrorist-related or inappropriate by machine learning algorithms. One of four key components of EU-directed content moderation.
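A simplified Python sketch of the re-upload blocking mechanism described above: uploads are matched against fingerprints of previously removed content. Real systems use perceptual hashes that survive re-encoding; the exact SHA-256 match below is a stand-in, and all names are hypothetical:

```python
import hashlib

# Hypothetical database of fingerprints of previously removed content.
removed_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    # Simplified stand-in: real filters use perceptual hashing,
    # not an exact cryptographic hash like SHA-256.
    return hashlib.sha256(content).hexdigest()

def register_removal(content: bytes) -> None:
    """Record removed content so future re-uploads can be blocked."""
    removed_hashes.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Block the upload if it matches previously removed content."""
    return fingerprint(content) not in removed_hashes

register_removal(b"previously removed propaganda video")
print(allow_upload(b"previously removed propaganda video"))  # False: blocked
print(allow_upload(b"new holiday photo"))                    # True: allowed
```

Unlike flagging, this mechanism acts before publication and without human review, which is why filtering is the most contested of the four components.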
European security integration
The process of coordinating and aligning security practices and policies across European countries. It involves cooperation among European Union member states to address security challenges collectively.
Public-private cooperation
The collaboration and interaction between public authorities and private entities, such as social media platforms, in addressing security challenges. It involves joint efforts and shared responsibilities in security-related activities.
Networked security
The interconnected and collaborative approach to security, where various actors, including public authorities, private companies, and international organizations, work together to address security threats.
The role of platforms in European security
Platforms, such as social media and online service providers, play a significant role in European security by moderating content, implementing security measures, and collaborating with public authorities to address security challenges.
Private Terms of Service (ToS)
The agreements set by social media companies that serve as the basis for their content moderation decisions. These agreements have quasi-legal force and are often referred to as community guidelines.