The Social and Emotional Brain, Lecture 7 Flashcards
The Social Brain Hypothesis
Humans, like other primates, have unusually large brains for their body size.
The brain is one of the most "expensive" organs in terms of running costs.
It evolved to deal with the complex information presented by a largely social world.
It was the computational demands of living in large, complex societies that selected for large brains.
Social Cognition
Focuses on how people process, store, and apply information about other people and social situations.
We encode, decode, infer, and interpret social information and social situations.
Includes perception of emotions and facial expressions, perception of others' eye-gaze direction, and prediction of the thoughts underlying others' behavior.
Reading Faces
Faces are not only objects of visual perception; they are also social objects, carrying information about another person's emotional state, intentions (eye gaze), membership in social categories (race, gender), and disposition (trustworthiness).
Capgras Syndrome
A person believes that their loved ones have been replaced by identical-looking impostors or body doubles.
Patients consciously recognize the person but lack an emotional response to them (Ellis & Young, 1990).
Is Face Processing Innate?
Fantz (1961), using a "looking chamber", demonstrated that infants preferred the real face, looked slightly less at the scrambled face, and ignored the control pattern.
Johnson et al. (1991): newborns tested within an hour of birth also oriented to face-like patterns, showing that newborns are sensitive to the structure of the human face.
Reid et al. (2017) examined fetal head turns to visually presented upright and inverted face-like stimuli. Simple dot/square configurations are enough to elicit responses to facial configuration; red light was used because it penetrates the tissue, and ultrasound was used to observe how the fetus behaved when the light was shone.
Results: Fetuses (at around 34 weeks of gestation) are more likely to engage with stimuli featuring an upright face-like configuration than with an inverted configuration.
Visual system development in infants and innate preference for faces:
EEG/ERP and fMRI studies show that the large-scale organization of visual brain areas in 4-6-month-old infants is already similar to that of adults but is subsequently refined through development (experience, maturation).
Infants can detect faces in around 290 ms (N290 – the infant version of the N170, a face-specific component).
Facial expression recognition:
An fNIRS study by Di Lorenzo et al. (2019) with 5-month-old participants showed that the right occipital area responds selectively to faces, indicating that the face-processing network is active at 5 months. However, sensitivity to facial emotions is still immature at this age.
Visual cliff experiment and social referencing:
Sorce et al. (1985) found that 12-month-old infants could decode emotional expressions and use them for decision-making in the visual cliff experiment.
The caregiver's facial expression of emotion influences the infant's decision; this is known as social referencing.
Eye gaze detection in social cognition:
Information from the eye region is a key social cue for understanding others, including distinguishing between emotions, establishing dyadic communication, orienting attention to critical objects, and giving clues about intention.
From around 4 months, infants can shift the direction of their gaze to look at someone's face, and newborns prefer to look at faces that engage them in mutual gaze.
Baron-Cohen et al., 1995, found that individuals with autism have intact perception of eye gaze but have difficulty using eye gaze information to predict behavior.
Brain Bases of Eye-Gaze Detection
Study: Hoffman & Haxby, 2000
Findings:
The STS (Superior Temporal Sulcus) is activated in the eye-gaze detection task. It is involved in processing changeable features of the face (e.g., gaze direction). Lesions impair the ability to detect gaze direction.
The FFA (Fusiform Face Area) is activated in the face-identity task. It processes invariant (unchangeable) features of the face, such as identity.
Brain Bases of Eye-Gaze Detection
Study: Pelphrey et al., 2005
Findings:
Both typically developing and autistic participants show STS activity in the eye-gaze detection task.
Autistic participants show no difference between congruent and incongruent trials, indicating that perception of the gaze shift is not linked to its mentalistic significance.
Reading Minds
Definition: Empathy is an emotional reaction based on our understanding of another person's feelings, together with the ability to infer their emotional experiences. It has two components: affective and cognitive.
Mechanisms/Theories:
Mirroring (Simulation Theory, ST) - affective component
Mentalising (Theory of Mind, ToM) - cognitive component
Mentalising and Theory of Mind (ToM)
Definition: Mentalising is the ability to infer mental states (desires, feelings) and intentions of others. ToM is concerned with the cognitive aspects of empathy, such as reasoning about mental states and attributing mental states to others.
Study: Vollm et al., 2005
Findings:
Both empathy and ToM activate the medial prefrontal cortex, temporoparietal junction, and temporal poles.
ToM-specific activity occurs in the orbitofrontal cortex.
Empathy-specific activity occurs in the amygdala.
What are the temporal lobes involved in?
Language and semantic memory, possibly representing/activating semantic schemas that specify current social and emotional contexts.