Lecture 26: Love & Belonging in the Age of AI Flashcards

1
Q

social chatbots

A

programs that use natural language processing to “converse” with users via text or voice

2
Q

Replika

A
  • Most popular chatbot with 30 million downloads
  • Biggest download spike: April 2020
3
Q

Anthropomorphism

A

the process of attributing human-like motivations, emotions, or characteristics to real or imagined non-human entities

4
Q

three factor theory of anthropomorphism

A
  1. elicited agent knowledge
  2. effectance motivation
  3. sociality motivation
5
Q

elicited agent knowledge

A
  • Accessibility and applicability of anthropocentric knowledge
  • More likely to anthropomorphize when anthropocentric (human-related) knowledge is readily accessible and/or applicable
  • Because we’re human, anthropocentric knowledge is generally easily accessible to us
  • By default, we tend to use our own mental states & characteristics as a guide when perceiving human and non-human entities
  • When reasoning about other humans: egocentric bias
  • When reasoning about non-human entities: anthropomorphism
  • Correction for our egocentric tendency is effortful
  • Therefore, deficits in either motivation or cognitive capacity to engage in this effortful correction should increase both egocentric bias and anthropomorphism
  • More likely to rely on egocentric knowledge when reasoning about self-similar target
  • Similarly, anthropomorphism should be more likely when a non-human agent resembles humans in morphology, movement, behaviour, etc.
6
Q

effectance motivation

A
  • Motivation to explain and understand the behaviour of other agents
  • We want to be able to predict, understand, and control the world around us
  • Anthropomorphism may be used to increase predictability and comprehension of an uncertain world
  • Ex. more likely to ascribe a mind to gadgets that behave in unpredictable ways
7
Q

sociality motivation

A
  • Desire for social contact and affiliation
  • Recall that we perceive our social worlds in a way that supports our social approach goals
  • Our social approach goals may drive us to anthropomorphize
8
Q

perceiving life in a face

A
  • Animacy perception is categorical
  • “Tipping point” of animacy occurs close to the human end of the spectrum
  • Eyes are a particularly informative region for detecting life in a face
9
Q

two dimensions of mind perception

A

agency & experience

10
Q

agency

A

capacity to act & plan

11
Q

experience

A

capacity to sense & feel

12
Q

agency & experience & character perception

A

Both dimensions are associated with increased liking for a character, perceiving it as having a soul, and prosocial motivation toward it

13
Q

Uncanny valley effect

A

entities like robots that are almost but not quite humanlike evoke strong feelings of eeriness & discomfort

14
Q

possible explanations for the uncanny valley effect

A
  • Threat to human uniqueness
  • Difficulty of categorization & cognitive friction
  • Expectancy violation: we expect entities with agency to also have experience, and we are disturbed when they do not
15
Q

components of empathy

A
  • Cognitive perspective-taking (empathic accuracy)
  • Affective sharing
  • Empathic concern
16
Q

Cognitive perspective-taking (empathic accuracy)

A

ability to identify what another person is experiencing (cold process)

17
Q

Affective sharing

A

the ability to share in the emotions of another (hot process)

18
Q

Empathic concern

A

the ability to experience feelings of sympathy and concern for another person (warm process)

19
Q

does AI have empathy?

A
  • AI may be good at the cognitive aspect of empathy, but not the affective, experiential aspects
  • Parallels to psychopaths
20
Q

when are our social approach goals heightened?

A
  • When our needs for social connection are unmet
  • We may turn to unconventional sources of connection (ex. parasocial relationships or anthropomorphized non-human entities)
21
Q

loneliness and anthropomorphism

A
  • Lonely individuals are more likely to anthropomorphize gadgets
  • Reminding individuals of a close, supportive relationship with an attachment figure decreases the tendency to anthropomorphize
  • Anxiously attached individuals are more likely to anthropomorphize
  • Anxious attachment is a stronger predictor of anthropomorphism than loneliness
  • Lonely individuals had an earlier “tipping point” of animacy (further from the human end of the spectrum)
22
Q

rise in loneliness

A
  • Loneliness is on the rise in the West
  • 2018 Cigna survey of 20,000 US adults:
    • Nearly half report sometimes or always feeling alone or left out
    • ¼ report that they rarely or never feel understood by others
    • ⅕ report that they rarely or never feel close to people or that there are people they can talk to
    • Generation Z (adults 18-22) is the loneliest generation
  • COVID-19 pandemic further promoted feelings of loneliness & isolation
  • Shadow “loneliness pandemic”
23
Q

human reactions to bot empathy study

A
  • Participants described a difficult situation they were dealing with and received a message in response
  • Varied the actual source of the message (Bing Chat vs. human) & the ostensible source of the message (Bing Chat vs. human)
  • AI-generated responses elicited more positive reactions (vs. human-generated response)
  • Felt more heard, better understood, and more connected to the responder
  • AI had greater empathic accuracy, provided more emotional support, was less self-focused
  • But, they also felt more positively when they believed that the response came from another human
  • Devalued responses labelled as AI-generated
  • Negative effects of AI label weaker for those who held more positive attitudes toward Bing Chat & those who perceived Bing Chat to have more agency & experience (particularly for feelings of connection)
24
Q

how do interactions with Replika compare to interactions with another human?

A
  • No significant group differences in positive emotions after interaction
  • Slightly, but significantly more negative affect in the face-to-face group
  • Highest levels of self-presentation concerns in the face-to-face group
  • In all 3 conditions, participants rated their partner as moderately responsive and felt well-liked by their partner
  • Higher ratings of liking & responsiveness for human interactant
  • More liking in face-to-face vs. text chat condition
25
Q

future questions in AI & relationship research

A
  • How will relationships with AI shape us and our motivation and capacity for fostering real human relationships?
  • How will AI evolve through their interactions with us?
  • What are the psychological and ethical implications of re-creating real relationships?
    • Deathbots and the process of grief
    • Ex-partners, unrequited love
26
Q

other concerns in AI & relationship research

A
  • Data privacy
  • Potential exploitation of human vulnerabilities (ex. loneliness, grief)
  • Financial barriers & concerns