W3L1 - Face processing and emotion Flashcards
Facial Emotional Recognition: Evidence for Innateness
Universal facial expression hypothesis: Evolved to recognize emotions
- Expressions are similar in closely related species (> Innate decoding)
- Emotional expressions are evident in deaf and blind people (but blind people are less proficient at posing expressions)
Facial Emotional Recognition: Theory for importance
The ability to decode intention confers a survival benefit
Facial Emotional Recognition: Evidence for Cross-cultural Similarity
High cross-cultural (both literate and preliterate) agreement in judgments of emotions in faces
6 of the 7 basic emotions (the exception being contempt)
What is the Anger Superiority Effect? Study
Finding the face in the crowd
- Some studies find an advantage for happy faces
- Target-absent trials consistently take longer than target-present trials
- In both target-present and target-absent trials, angry faces are identified consistently faster
What is the Anger Superiority Effect? Study's Caveats
1) May depend on stimulus set: larger set sizes take longer
2) May depend on feature strength: e.g. whiteness of teeth
What supports the feature strength hypothesis of identifying emotions?
Facial Action Coding System (FACS): combinations of components indicate specific emotions
Emotion perception of morphed faces reveals categorical perception:
Anger, fear and sadness = Top half of the face
Happiness and disgust = Bottom half of face
Surprise = Equally recognisable
> Suggests non-holistic processing, based on a subset of the face
Are features used to encode facial expression?
Composite effects are found (decoding is impaired for aligned composites), suggesting holistic processing
Identification of Upper Expression:
1) Aligned vs. misaligned happy = SLOW
2) Aligned vs. misaligned angry = FAST
> Effects disappear with inversion
> Indicative of holistic processing (susceptible to the same effects)
From the composite task, what can we say about recognition and expression perception?
Recognition and Expression perception are independent.
Expression judgements for composites are unaffected by the component faces' identities, and vice versa (i.e. in an expression task, identity has no effect; in an identity task, expression has no effect)
Can prosopagnosics decode expressions of emotion?
Evidence is mixed for developmental prosopagnosics
Yes:
- Can label basic facial expressions and even decode difficult-to-categorise emotions
No:
- Deficits in facial expression recognition
> Suggests they may use individual features to decode?
Can prosopagnosics decode expressions of emotion? Evidence for non-holistic processing
Decoding is impaired. Composite-effect task: identity or emotion matching (blocked top/bottom halves)
- No aligned vs. misaligned difference for prosopagnosics, unlike controls
- Holds for both identity and emotion alike
What do models of face recognition suggest for expression?
Independence.
Bruce & Young:
Dedicated route for Expression
Haxby et al. (2000):
Superior Temporal Sulcus (STS) = Expression
Ventral Temporal Route (including FFA) = Identity
Do we have different locations for different emotions?
Yes. Probably.
Behavioural Evidence to suggest different locations for different emotions
Dynamic changes in muscle activity over time for different facial expressions suggest decoding over time
Disgust, anger and fear expressions move more quickly.
Physiological Evidence to suggest different locations for different emotions. Activation Study
Activation timeline:
- 120 ms: fast, "quick and dirty" low road
- 170 ms: detailed perception
- 300 ms: conceptual knowledge
> Unlike the FFA, a dynamic network
Physiological Evidence to suggest different locations for different emotions. MEG study
MEG: response to happy/fearful/neutral faces in an identity or emotion task
90ms - Orbito-frontal response to emotion without attention
170ms - Right-insula response to emotion with attention
220ms - Identity processing
> Emotion is processed before identity (amygdala involvement)