Algorithmic Policing Flashcards

1
Q: What is algorithmic policing?

A: The use of data and algorithms by law enforcement to forecast where and when crime is likely to occur and who is likely to be involved, enabling pre-emptive policing.

2
Q: What is the goal of algorithmic policing?

A: To predict and prevent crime by analyzing historical data and deploying police resources in advance.

3
Q: What does Feeley & Simon (1994) say about algorithmic policing?

A: It represents a “new penology”: a strategy focused on classifying and managing groups based on perceived dangerousness.

4
Q: What does Zedner (2007) say about algorithmic policing?

A: It exemplifies a rationality of security governance that acts pre-emptively to change future outcomes, rather than responding to crime after it has occurred.

5
Q: What are the four stages of algorithmic policing in practice?

A: 1. Collecting data (e.g., arrests, location, time)
2. Analyzing with algorithms
3. Forecasting future crimes
4. Responding with police action
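
Purely as an illustration of those four stages, here is a minimal toy pipeline sketch in Python. Everything in it is assumed for the example: the incident records, the area names, and the simple count-past-incidents forecasting rule; real systems are far more elaborate and usually proprietary.

```python
# Toy sketch of the four stages; all data and rules here are invented examples.
from collections import Counter

# 1. Collect data: historical incident records (offence type, area, hour).
incidents = [
    {"offence": "theft",   "area": "north", "hour": 22},
    {"offence": "theft",   "area": "north", "hour": 23},
    {"offence": "assault", "area": "south", "hour": 1},
]

# 2. Analyze with an algorithm: here, simply count past incidents per area.
counts = Counter(record["area"] for record in incidents)

# 3. Forecast: treat the area with the most past incidents as the next "hotspot".
hotspot, _ = counts.most_common(1)[0]

# 4. Respond: direct patrols to the forecast hotspot.
print(f"Send extra patrols to: {hotspot}")  # -> north
```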

6
Q: What is “social sorting” (Lyon, 2007)?

A: The process of classifying people into groups based on behavior or data. Those who don’t fit the norm are seen as suspicious, reinforcing social inequality.

7
Q: What is “racializing surveillance” (Browne, 2015)?

A: Surveillance practices that reinforce racial boundaries, leading to discriminatory outcomes, especially for racialized groups.

8
Q: What does Garland (1996) say about algorithmic policing?

A: It leads to the criminalization of the ‘alien other’ — framing marginalized groups as inherently dangerous and unlike the dominant social group.

9
Q: How does algorithmic policing undermine social trust?

A: It assumes everyone is a potential threat, promoting the idea that no citizen is trustworthy, which erodes the presumption of innocence (UK House of Lords, 2009).

10
Q: Assumption 1: Data is a perfect reflection of reality — why is this flawed?

A: Data is based only on what’s reported or observed, which is influenced by biases in policing, underreporting, and systemic inequality.
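
As a concrete (and entirely invented) illustration of this card: suppose two areas have identical underlying crime but different reporting and patrol-detection rates. The recorded data the algorithm sees already diverges from reality before any analysis happens.

```python
# Invented figures: identical underlying crime, different detection/reporting rates.
actual_crime = {"area_a": 100, "area_b": 100}
detection_rate = {"area_a": 0.3, "area_b": 0.7}   # area_b is patrolled and reported more

recorded = {area: round(actual_crime[area] * detection_rate[area])
            for area in actual_crime}
print(recorded)  # {'area_a': 30, 'area_b': 70}: equal crime, unequal data
```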

11
Q: Assumption 2: Algorithms are neutral — why is this flawed?

A: Algorithms are built by humans, and the decision about which variables matter reflects bias and values, not objectivity.

12
Q: Assumption 3: The primacy of place — what’s the issue here?

A: Not all crimes are equally tied to place, yet algorithms often focus on location, which leads to the over-policing of certain neighborhoods.

13
Q: Assumption 4: More policing is the solution — what’s the problem?

A: Over-policing may harm communities, while long-term, non-police strategies (like social services) may better address root causes.

14
Q: Assumption 5: Technology increases accountability — is this true?

A: Not always. Many algorithms are privately owned, making them difficult to audit or investigate independently.

15
Q: What justice-related concerns are raised by algorithmic policing?

A: Reinforces bias and targets “usual suspects”
Masks discrimination with a veneer of objectivity
Undermines democratic accountability
Encourages privatization and lack of transparency

16
Q: How can algorithmic policing reproduce inequality?

A: By learning from biased data, algorithms may direct policing disproportionately toward certain racialized groups, neighborhoods, or classes, reinforcing structural discrimination.
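
One way to see how this happens is a minimal feedback-loop simulation; every number and rule below is invented for illustration. Patrols are sent to whichever area has the higher recorded count, new records accumulate wherever patrols are, and a small initial disparity in the data keeps widening even though the underlying crime rates never differ.

```python
# Toy simulation with invented numbers: identical true crime in both areas,
# but recorded incidents depend on where patrols are sent.
true_rate = {"area_a": 10, "area_b": 10}   # same underlying crime each period
recorded = {"area_a": 6, "area_b": 4}      # small initial disparity in the records

for period in range(5):
    # Forecast-and-respond: send most patrols to the current "hotspot",
    # i.e. the area with the higher recorded count so far.
    hotspot = max(recorded, key=recorded.get)
    patrol_share = {area: (0.8 if area == hotspot else 0.2) for area in recorded}
    # Recorded incidents grow with patrol presence, not with true crime.
    for area in recorded:
        recorded[area] += round(true_rate[area] * patrol_share[area])
    print(period, recorded)

# The area that happened to start with more recorded crime attracts more patrols,
# which produces more records, which attracts more patrols: the gap in the data
# widens even though the two areas were never actually different.
```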