Lecture 26: Love & Belonging in the Age of AI Flashcards
social chatbots
programs that use natural language processing to “converse” with users via text or voice
Replika
- Most popular chatbot with 30 million downloads
- Biggest download spike: April 2020
Anthropomorphism
the process of attributing human-like motivations, emotions, or characteristics to real or imagined non-human entities
three factor theory of anthropomorphism
- elicited agent knowledge
- effectance motivation
- sociality motivation
elicited agent knowledge
- Accessibility and applicability of anthropocentric knowledge
- More likely to anthropomorphize when anthropocentric (human-related) knowledge is readily accessible and/or applicable
- Because we’re human, anthropocentric knowledge is generally easily accessible to us
- By default, we tend to use our own mental states & characteristics as a guide when perceiving human and nonhuman entities
- When reasoning about other humans: egocentric bias
- When reasoning about non-human entities: anthropomorphism
- Correction for our egocentric tendency is effortful
- Therefore, deficits in either motivation or cognitive capacity to engage in this effortful correction should increase both egocentric bias and anthropomorphism
- More likely to rely on egocentric knowledge when reasoning about self-similar target
- Similarly, anthropomorphism should be more likely when a non-human agent resembles humans in morphology, movement, behaviour, etc.
effectance motivation
- Motivation to explain and understand the behaviour of other agents
- We want to be able to predict, understand, and control the world around us
- Anthropomorphism may be used to increase predictability and comprehension of an uncertain world
- Ex. more likely to ascribe a mind to gadgets that behave in unpredictable ways
sociality motivation
- Desire for social contact and affiliation
- Recall that we perceive our social worlds in a way that supports our social approach goals
- Our social approach goals may drive us to anthropomorphize
perceiving life in a face
- Animacy perception is categorical
- “Tipping point” of animacy occurs close to the human end of the spectrum
- Eyes are a particularly informative region for detecting life in a face
two dimensions of mind perception
agency & experience
agency
capacity to act & plan
experience
capacity to sense & feel
agency & experience & character perception
Both dimensions are associated with increased liking for a character, perceiving it as having a soul, and prosocial motivation toward the character
Uncanny valley effect
entities like robots that are almost but not quite humanlike evoke strong feelings of eeriness & discomfort
possible explanations for the uncanny valley effect
- Threat to human uniqueness
- Difficulty of categorization & cognitive friction
- Expectancy violation: we expect entities with agency to also have experience, and we are disturbed when they do not
components of empathy
- Cognitive perspective-taking (empathic accuracy)
- Affective sharing
- Empathic concern
Cognitive perspective-taking (empathic accuracy)
ability to identify what another person is experiencing (cold process)
Affective sharing
the ability to share in the emotions of another (hot process)
Empathic concern
the ability to experience feelings of sympathy and concern for another person (warm process)
does AI have empathy?
- AI may be good at the cognitive aspect, but not the affective, experiential ones
- Parallels to psychopaths
when are our social approach goals heightened?
- When our needs for social connection are unmet
- We may turn to unconventional sources of connection (ex. parasocial relationships or anthropomorphized non-human entities)
loneliness and anthropomorphism
- Lonely individuals are more likely to anthropomorphize gadgets
- Reminding individuals of a close, supportive relationship with an attachment figure decreases the tendency to anthropomorphize
- Anxiously attached individuals are more likely to anthropomorphize
- Anxious attachment is a stronger predictor of anthropomorphism than loneliness
- Lonely individuals had an earlier “tipping point” of animacy (further from the human end of the spectrum)
rise in loneliness
- Loneliness is on the rise in the West
- 2018 Cigna survey of 20,000 US adults:
- Nearly half report sometimes or always feeling alone or left out
- ¼ report that they rarely or never feel understood by others
- ⅕ report that they rarely or never feel close to people or that there are people they can talk to
- Generation Z (adults 18-22) is the loneliest generation
- COVID-19 pandemic further promoted feelings of loneliness & isolation
- Shadow “loneliness pandemic”
human reactions to bot empathy study
- Participants described a difficult situation they were dealing with and received a message in response
- Varied the actual source of message (Bing chat vs. human) & ostensible source of message (Bing Chat vs. human)
- AI-generated responses elicited more positive reactions (vs. human-generated response)
- Felt more heard, better understood, and more connected to the responder
- AI had greater empathic accuracy, provided more emotional support, was less self-focused
- But, they also felt more positively when they believed that the response came from another human
- Devalued responses labelled as AI-generated
- Negative effects of AI label weaker for those who held more positive attitudes toward Bing Chat & those who perceived Bing Chat to have more agency & experience (particularly for feelings of connection)
how do interactions with Replika compare to interactions with another human?
- Unstructured “get-to-know-you” conversation
- No significant group differences in positive emotions after interaction
- Slightly, but significantly more negative affect in the face-to-face group
- Highest levels of self-presentation concerns in the face-to-face group
- In all 3 conditions, participants rated their partner as moderately responsive and felt well-liked by their partner
- Higher ratings of liking & responsiveness for human interactant
- More liking in face-to-face vs. text chat condition