Lecture 5 Flashcards

1
Q

Prejudice in AI

A

AI and the algorithms we use are trained on information from human society, and then produce results that consumers of AI use.

- This may be a pathway through which systemic forms of bias are re-presented to humans (see the sketch below)
- Bias toward societal stereotypes
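A minimal sketch of this pathway, assuming invented data and a generic scikit-learn classifier (nothing here is from the lecture): a model fit to historically biased decisions reproduces that bias in what it shows its consumers.

```python
# A model fit to biased historical decisions re-presents the bias.
# All data is synthetic; "merit" and "group" are invented features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
merit = rng.normal(size=n)                  # a job-relevant signal
group = rng.integers(0, 2, size=n)          # 0/1: a societal group label
# Historical "hired" labels encode merit plus a bias against group 1.
hired = (merit - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

model = LogisticRegression().fit(np.column_stack([merit, group]), hired)

# Two applicants with identical merit but different group membership:
print(model.predict_proba([[0.5, 0], [0.5, 1]])[:, 1])
# The group-1 applicant gets a lower predicted hiring probability:
# the societal bias in the training data flows back to AI consumers.
```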

2
Q

Where do these patterns come from?

A

Algorithms are created by people, and since people have biases, the algorithms will be biased in some way.

- They reflect and reinforce stereotypes
- Biased results are often attributed to human search tendencies, but they also reflect algorithm design
  - Design choices are based on marketing/business priorities

3
Q

Where do these patterns come from, according to different disciplines?

A

SOCIOLOGISTS
- Patterns reflect existing societal structures and disparities

COMPUTER SCIENTISTS
- Patterns emerge from the data & AI computations

PSYCHOLOGISTS
- Have not been part of this conversation, but they need to be
- Patterns reflect human social cognition and decision processes as they interface with societal structures & AI computation
  - It's about individual humans: how prejudice affects social cognition, and how these individuals then interact with both society and data

4
Q

Is gender inequality reflected in search output?

A

In countries with more gender inequality, search results for "person" were more often male.

Result:
Exposure to search-engine results from a high-inequality country led participants to assume that men are more likely to be hired for jobs.
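A hedged sketch of the kind of analysis such a finding rests on, with made-up numbers and hypothetical column names (the actual study's data and methods may differ): correlate a country-level inequality index with the share of male results in each country's "person" search.

```python
# Hypothetical country-level data; column names are invented.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "country": ["A", "B", "C", "D", "E"],
    "gender_inequality_index": [0.10, 0.25, 0.40, 0.55, 0.70],
    "share_male_person_results": [0.51, 0.57, 0.64, 0.70, 0.78],
})
r, p = pearsonr(df["gender_inequality_index"],
                df["share_male_person_results"])
print(f"r = {r:.2f}, p = {p:.3f}")
# A positive r would mean: the more unequal the country, the more
# often a "person" search returns male results.
```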

5
Q

Cycle of bias

A

Societal structures shape algorithms; algorithms influence users; users then act in ways that maintain those societal structures.
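A toy simulation of this cycle; every coefficient is an invented assumption, chosen only to show how the loop can sustain itself.

```python
# society_bias: share of biased content in society (invented number).
society_bias = 0.60
for step in range(5):
    # The algorithm mirrors the society it was trained on.
    algorithm_output = society_bias
    # Assumption: users shift beliefs slightly past what they are shown.
    user_belief = 0.5 + 1.2 * (algorithm_output - 0.5)
    # Users' actions (clicks, hires, posts) feed back into society.
    society_bias = 0.5 * society_bias + 0.5 * user_belief
    print(f"step {step}: society_bias = {society_bias:.3f}")
# Bias drifts away from parity (0.5) rather than washing out, so the
# loop maintains the societal structure it started from.
```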

6
Q

Bias in NLP

A

NLP relies on digitized texts.

- Biased toward whose texts are digitized
- If you go back further in time, you'll find more sexism and more overt prejudice and racism
- Training data may have systemic bias and race proxies built in (see the sketch below)
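A minimal sketch of how text bias surfaces in NLP representations, using toy 3-d vectors in place of embeddings actually learned from a corpus; the cosine-similarity comparison loosely follows WEAT-style association tests.

```python
# Toy 3-d vectors stand in for embeddings learned from digitized texts.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

emb = {
    "man":    np.array([0.9, 0.1, 0.0]),
    "woman":  np.array([0.1, 0.9, 0.0]),
    "career": np.array([0.8, 0.2, 0.1]),
    "family": np.array([0.2, 0.8, 0.1]),
}
gap = cosine(emb["man"], emb["career"]) - cosine(emb["woman"], emb["career"])
print(f"career association gap (man - woman): {gap:.2f}")
# A positive gap means "career" sits closer to "man" in the learned
# space: the stereotype in the source texts is built into the model.
```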

7
Q

Bias in computer vision

A
Two main sources:
1. The data used to create and train algorithms (see the audit sketch below)
2. Social/cultural biases of the human annotators during dataset creation
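A hedged sketch of auditing source 1: checking how a face dataset's images are distributed across groups before training. The field names and records are hypothetical.

```python
# Hypothetical records; "annotated_race" is an invented field name.
from collections import Counter

dataset = [
    {"image": "img_001.jpg", "annotated_race": "white"},
    {"image": "img_002.jpg", "annotated_race": "white"},
    {"image": "img_003.jpg", "annotated_race": "white"},
    {"image": "img_004.jpg", "annotated_race": "black"},
]  # ... rest of the dataset

counts = Counter(row["annotated_race"] for row in dataset)
total = sum(counts.values())
for group_name, n in counts.items():
    print(f"{group_name}: {n / total:.0%} of training images")
# Skew here is source 1; disagreement or prejudice in the annotators
# who produced these labels is source 2.
```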
8
Q

Does dataset diversity affect race classification accuracy?

A

More data and more diverse data = higher accuracy for all races

- Instead of higher accuracy for white faces compared to other races (see the per-group evaluation sketch below)
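A minimal sketch of the disaggregated evaluation behind this card, with invented labels: compute accuracy separately per group instead of reporting one overall number.

```python
# Invented true/predicted labels for a race classifier.
import numpy as np

y_true = np.array(["white", "white", "white", "black", "black", "asian"])
y_pred = np.array(["white", "white", "white", "black", "white", "black"])

for group in np.unique(y_true):
    mask = y_true == group
    acc = np.mean(y_pred[mask] == y_true[mask])
    print(f"{group}: accuracy = {acc:.2f} (n = {mask.sum()})")
# The card's claim: with more and more diverse training data, these
# per-group accuracies converge instead of favoring white faces.
```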

9
Q

Annotator prejudice effects

A

A model trained on images labeled by low-prejudice annotators was more accurate at classifying race from the faces of Black individuals.
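A hedged sketch of the comparison this card describes, on synthetic data: split the training labels by the annotators' prejudice scores, train a model on each split, and compare accuracy on a common test set. The assumption that higher-prejudice annotators mislabel more often is built in for illustration only.

```python
# Synthetic features and labels; the noise model is an assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 5))               # stand-in image features
y_true = (X[:, 0] > 0).astype(int)          # ground-truth race label
prejudice = rng.uniform(0, 1, size=400)     # each image's annotator score
# Assumption: higher-prejudice annotators mislabel more often.
flipped = rng.uniform(size=400) < prejudice * 0.4
y_given = np.where(flipped, 1 - y_true, y_true)

X_test = rng.normal(size=(200, 5))
y_test = (X_test[:, 0] > 0).astype(int)

for name, mask in [("low-prejudice", prejudice < 0.5),
                   ("high-prejudice", prejudice >= 0.5)]:
    model = LogisticRegression().fit(X[mask], y_given[mask])
    print(f"{name} annotators: test accuracy = "
          f"{model.score(X_test, y_test):.2f}")
```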
