Week 2 Flashcards
❓ What is the Panopticon, and how does it function?
✅ A centralized surveillance model in which an observer in a central tower can see all inmates, but the inmates cannot see the observer. This creates a state of permanent visibility, prompting individuals to discipline their own behavior.
❓ What is the major effect of the Panopticon?
✅ It induces a state of conscious and permanent visibility, ensuring automatic power without direct force.
❓ How is the Panopticon a “political technology”?
✅ It generalizes power relations beyond prisons, influencing everyday institutions like schools, workplaces, and governments.
❓ What is the Surveillant Assemblage?
✅ A networked, decentralized system of surveillance that collects data from multiple sources to monitor and control populations.
❓ What are “data doubles”?
✅ Digital profiles created by surveillance systems that classify, track, and influence individuals, often independent of their actual identity.
❓ How does the Surveillant Assemblage affect different social classes?
✅ Poor people are subject to criminal justice and social welfare surveillance.
Wealthier individuals experience corporate and consumer tracking.
The more institutions someone interacts with, the more surveillance they experience.
❓ What is subjectivity, and how does surveillance shape it?
✅ Subjectivity refers to how individuals think, internalize norms, and perceive themselves. Surveillance influences subjectivity by defining what is considered “normal” or “abnormal.”
❓ How does disciplinary power differ from traditional power?
✅ Instead of relying on force or violence, disciplinary power works by making people internalize rules and discipline their own behavior.
❓ How does social sorting differ from the Panopticon?
✅ Panopticon: Individual-level surveillance through visibility & self-discipline.
Social Sorting: Macro-level categorization, where institutions classify and control populations based on strategic interests (e.g., race, risk factors).
❓ What are the stakes of “data doubles” in surveillance?
✅ Accuracy and bias: decisions based on incomplete or biased data can affect a person's opportunities, security, and rights (e.g., predictive policing, credit scores).