Session 9 Flashcards
Q: What is the main critique Lucy Suchman presents in “Algorithmic Warfare and the Reinvention of Accuracy”?
A: Suchman critiques the U.S. military’s use of AI and automated systems in warfare, arguing that claims of precision and accuracy obscure political responsibility, rely on discriminatory profiling, and result in indiscriminate violence.
Q: What is Project Maven?
A: Project Maven is a U.S. Department of Defense initiative launched in 2017 to use AI and machine learning to analyze drone surveillance footage and automate the identification of potential targets.
Q: How does Suchman describe the concept of “situational awareness”?
A: She critiques situational awareness as a military construct that promises perfect knowledge of threats but is fundamentally flawed, relying on discriminatory apparatuses of recognition and dehumanizing classifications.
Q: What does Suchman mean by the “reinvention of accuracy”?
A: The reinvention of accuracy refers to how military technologies conflate weapon precision with legitimate target identification, masking the political and ethical violence embedded in the targeting process.
Q: What role does feminist and critical security studies theory play in Suchman’s analysis?
A: Suchman uses feminist and critical security studies to reveal how algorithmic warfare is embedded in racialized, gendered, and political logics that determine who becomes “targetable” and “killable.”
Q: What solution does Suchman propose to the dangers of algorithmic warfare?
A: She calls for rejecting technological solutionism and redirecting resources away from automated warfare toward diplomacy, social justice, and accountable forms of global security.
Q: What are loitering munitions and why are they controversial?
A: Loitering munitions are expendable drones that autonomously search for and attack targets. They are controversial because of the uncertainty around the level of human control and the potential for fully autonomous targeting without human oversight.
Q: What is a drone swarm in military terms?
A: A drone swarm is a coordinated group of uncrewed aerial vehicles (UAVs) that communicate and operate collectively, often without direct human intervention, to perform surveillance, reconnaissance, or offensive military operations.
Q: What are the three main drivers behind the development and proliferation of autonomous drones?
A: Strategic (great power competition), operational (efficiency, speed, and precision in warfare), and economic (lower costs compared to crewed systems).
Q: What are the key legal concerns associated with autonomous drones?
A: Autonomous drones challenge fundamental principles of International Humanitarian Law (IHL), including distinction between civilians and combatants, proportionality, and the requirement for meaningful human control.
Q: What ethical issues are raised by the use of autonomous drones?
A: Ethical concerns include the delegation of life-and-death decisions to machines, undermining human dignity, and removing moral responsibility from human operators.
Q: What is the main technological risk associated with the use of AI in autonomous drones?
A: AI systems used in drones are brittle, prone to errors, can be manipulated or hacked, and often struggle to accurately distinguish between legitimate targets and civilians, particularly in complex environments.
Q: What is the main argument of Elke Schwarz’s article “From Blitzkrieg to Blitzscaling”?
A: Schwarz argues that the logic and practices of venture capital (VC) investment are reshaping military norms, procurement processes, and defense strategies, prioritizing rapid growth, profit, and disruption over ethical and democratic accountability.
Q: What is “blitzscaling” and how is it applied in the defense sector?
A: Blitzscaling refers to the VC-driven strategy of prioritizing rapid, exponential growth over efficiency and accountability. In the defense sector, it pushes startups to scale quickly by disrupting traditional procurement processes and accelerating the adoption of new military technologies.
Q: Name two key defense startups that embody VC influence in the military domain.
A: Anduril Industries and Palantir Technologies are two prominent VC-backed defense startups that have secured major U.S. defense contracts and shaped military innovation toward AI-enabled systems.
Q: What are the primary narratives used by VC-backed defense companies to legitimize their growing influence?
A: Narratives of urgency and crisis (e.g., competition with China), the need for bureaucratic reform, technological inevitability, and patriotism/democratic defense are used to justify rapid adoption of their technologies.
Q: What ethical risks does Schwarz associate with the influx of VC in the defense sector?
A: Schwarz warns that VC logics prioritize speed, profit, and market dominance over ethical considerations, democratic accountability, and long-term security. This can erode public oversight and increase the risk of conflict escalation.
Q: How has the U.S. defense sector structurally adapted to accommodate VC interests?
A: Through procurement reforms like the Adaptive Acquisition Framework (AAF) and Other Transaction Agreements (OTAs), which reduce oversight and allow faster, more flexible contracting tailored to startup timelines.
Q: What is Denise Garcia’s main argument in the introduction of Artificial Intelligence to Benefit Humanity?
A: Garcia argues that the militarization and weaponization of AI threaten global peace, human dignity, and international stability, and she calls for urgent global governance and cooperation to prevent autonomous killing.
Q: What does Garcia refer to as the “third revolution in warfare”?
A: The development and deployment of autonomous weapon systems, following the first (gunpowder) and second (nuclear weapons) revolutions in warfare.
Q: What is “transnational networked cooperation” according to Garcia?
A: A collective effort by states, scientists, civil society, and private actors to create global governance frameworks for military AI, beyond traditional state-centric models.
Q: What ethical and legal concerns does Garcia associate with autonomous weapons?
A: They risk violating international humanitarian law principles (distinction, proportionality, precaution), delegating life-and-death decisions to machines, and lowering the threshold for war.
Q: What is “common good governance” as introduced by Garcia?
A: A governance model aimed at creating global public goods (such as peace and security) through inclusive and cooperative efforts involving states, civil society, scientists, and other actors.
Q: Why does Garcia argue that the regulation of military AI is urgent?
A: Because the pace of AI militarization is outpacing the development of international legal norms, increasing global instability and threatening the future of human-centered security.
Q: What is the main argument of Eric Schmidt’s article on AI and national security?
A: Schmidt argues that the rapid development of AI is transforming both global economic competition and international security, intensifying great power rivalries, particularly between the U.S. and China.
Q: How is AI impacting global security according to Schmidt?
A: AI is amplifying threats across the security spectrum: cyberattacks and disinformation, escalation of conventional warfare, and the potential destabilization of nuclear deterrence.
Q: What are the two major features of global digital network platforms that Schmidt identifies as national security concerns?
A: 1) Their tendency toward consolidation, giving few actors global influence. 2) Many nations’ reliance on platforms designed and hosted in rival countries, creating vulnerabilities.
Q: What strategic approach does Schmidt recommend regarding the U.S.-China technology relationship?
A: Schmidt advocates for selective decoupling to protect national security interests, while maintaining scientific collaboration and commercial interdependence where beneficial.
Q: What are Schmidt’s key concerns about AI-enabled warfare?
A: AI’s speed, autonomy, and unpredictability increase the risk of unintended escalation, reduce opportunities for de-escalation, and blur the line between surveillance, targeting, and lethal action.
Q: What policy measures does Schmidt suggest to maintain U.S. leadership in AI?
A: Increased federal investment in R&D, public-private partnerships, safeguarding AI talent, strengthening cybersecurity, and pursuing international agreements to limit destabilizing uses of AI.
Q: What is the main argument of the chapter “Technological Change and Grand Strategy”?
A: The chapter argues that technological change and grand strategy are mutually constitutive: technological innovation shapes the goals, instruments, and sources of grand strategy, while grand strategy can drive or constrain technological development.
Q: What are the five key features of technological change identified in the chapter?
A: (1) It creates new opportunities and shifts the strategic environment.
(2) It has distributional effects.
(3) It can be endogenous or exogenous.
(4) It generates uncertainty.
(5) It requires complementary assets (skills, infrastructure).
Q: How does technological change affect the sources of grand strategy?
A: It influences domestic coalitions, economic resources, and dominant narratives, potentially empowering or weakening political actors and reshaping a state’s strategic needs.
Q: How can technological change impact the goals of grand strategy?
A: It can render certain strategic goals obsolete or newly achievable by altering the strategic environment, such as how Western technological superiority forced China and Japan to open up in the 19th century.
Q: How can grand strategy influence technological change?
A: States may actively invest in specific technologies to gain strategic advantages (e.g., U.S. nuclear triad, stealth technology), which can lead to direct military benefits and indirect civilian “spin-offs.”
Q: What is a potential risk of grand strategy-driven technological innovation?
A: Excessive focus on military innovation can distort national economies, limit civilian technological development, and generate long-term economic and political costs.