Week 7 Flashcards
Speech in Noise
• Vision helps to resolve ambiguity
• Few true “lip readers”, but visual speech greatly improves perception in noise
• Automatic
• Greater than the sum of its parts
Warning Signals
• Faster response to multimodal stimuli
• Flashing indicator
• Beeping indicator
• Flash + beep – more reliable, faster
Flavour
“Flavour is in the brain”
• Taste
• Smell
• Somatosensory
Modulators:
• Sight
• Sounds
• Smells
• Expectation
A Unified Perceptual Experience
“… it is surely one of the great remaining scientific puzzles just
how it is that signals from such completely separated and
wholly dissimilar sensory epithelia as the haircells of the
cochlea, the photoreceptors of the retina and the corpuscles
of the skin can be integrated centrally to form such a seamless
unitary perceptual world”.
Molholm and Foxe, 2010 (p 1709)
Overview - Perception
A primary role of the brain:
• Determine what is “out there” in the world
• Decide on the best/most appropriate action/behaviour
  • Food to approach?
  • Danger to avoid?
  • Path to navigate?
  • Potential mate to interact with?
Survival depends on the speed and accuracy with which an organism can evaluate external events and properly react to them.
To determine what is “out there”, we need to:
• Gather information
• Interpret that information
• Represent the information
Multisensory Integration
Overview
• Perception
• Multisensory Perception
  • Example – Redundant Targets Effect
• Factors influencing integration
  • Example – Stream-Bounce Effect and Top-Down Factors
• Neuroscience of multisensory integration
  • Superior Colliculus
  • Superior Temporal Sulcus
    • Example – McGurk Effect
  • Posterior Parietal Cortex
    • Example – Flash/Beep Effect
Multisensory Perception
Objects and events in the world generate information in multiple ways:
• Light
• Sound
• Mechanical
• Chemical
Multisensory Perception
… evaluate external events and properly react
to them
Many information sources, so many senses are advantageous:
• Different modalities can substitute when one is compromised (e.g. vision lost in the dark)
• Different fields of operation – touch/smell/taste for up close, vision/hearing for distance
• Resolve ambiguities – two things may sound the same but look different; boost signal-to-noise
Multisensory Integration
Multisensory integration enhances our ability to perceive
and understand our environment, enabling us to better
interact with our surroundings
• Combine info from multiple cues to improve stimulus
detection and discrimination – increase speed and
accuracy (e.g. Audio-visual warning)
• Resolve perceptual ambiguities (e.g. Stream-Bounce)
• Create novel representations (e.g. Flavour)
Redundant Targets Effect
• Miller (1982)
• Speeded response to either an auditory (A), visual (V), or audio-visual (AV) target
• Redundant, since the additional stimulus doesn’t provide any additional information
2 models:
1. Statistical facilitation – independent (parallel) processing
2. Neural coactivation – integrated signals
Redundant Targets Effect
Statistical Facilitation
• Both elements of the AV stimulus are processed along independent channels
• The one that reaches the output stage first triggers the response
• On average, the time of the winner will be less than the average time for either racer alone
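Statistical facilitation can be illustrated with a short simulation (a sketch only: the reaction-time distributions and their parameters below are hypothetical, not Miller's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical unimodal reaction times in ms (illustrative parameters)
rt_visual = rng.normal(250, 40, n_trials)
rt_audio = rng.normal(240, 40, n_trials)

# Race model: on each trial the faster channel triggers the response,
# with no interaction between the channels
rt_race = np.minimum(rt_visual, rt_audio)

print(rt_visual.mean(), rt_audio.mean(), rt_race.mean())
```

Even with fully independent channels, the mean of the winner is reliably lower than either unimodal mean – a purely statistical speed-up.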
Redundant Targets Effect
Neural Coactivation
• Both components of a redundant signal influence
response on a single trial
• Activation from different channels combines to satisfy a single criterion for response initiation – activation builds over time until the criterion is reached
• Activation builds faster when provided by two
sources rather than one
Redundant Targets Effect process
Statistical facilitation: visual signal processed / auditory signal processed (independently) → decision → response
Neural coactivation: visual + auditory → audio-visual signal processed → decision → response
Redundant Targets Effect cont.
• Responses to redundant signals are too fast to be explained as the faster of two responses to individual signals
• The easiest way to explain the speed of responses
to redundant signals is to assume that signals
jointly contribute to the process of producing the
response
• Neural coactivation – multisensory integration (MSI)
MSI – Key Issue
• The sensory environment is complex – there are multiple sources in each modality, e.g. lots of visual objects and lots of sounds
• A key function of MSI is to dissociate between stimuli from different sources and stimuli from a single source
• Integration
• what to bind
• what to segregate
Factors Influencing MSI
- Temporal Coincidence
- Spatial Coincidence
- Temporal patterning
- Crossmodal correspondence
- Stored knowledge
- Context
- Recent experience
- Expectation
- Attention
Factors Influencing MSI – bottom-up processing (objective)
- Temporal Coincidence
- Spatial Coincidence
- Temporal patterning
Factors Influencing MSI – top-down processing (subjective)
- Stored knowledge
- Context
- Recent experience
- Expectation
- Attention
Bottom-up Factors
• Temporal Coincidence – things that happen at the same
time
• Spatial Coincidence – things that happen at the same
place
• Temporal patterning – things that are correlated over
time
Strong cues that stimuli were caused by the same event
and so belong together
Top-down Factors
Information that is already present in our brain
influences how crossmodal signals are combined
• Stored knowledge
• Recent experience
• Context
• Expectation
• Attention
Top-down: Context
Stream-bounce (Zeljko and Grove); 3 blocks of 200 trials each:
• Block 1 – all no-sound
• Block 2 – mixed sound/no-sound
• Block 3 – all no-sound
Top-down: Expectation
When are cognitive influences (e.g. previous response, overall presence of the tone) having their impact? Does expectation lead to an early decision or a bias?
• Typical S/B – observe the entire motion sequence, then provide a response (subjective)
• Want to see how the percept evolves
• Record cursor position at each frame:
  - Target indicated to follow
  - No sound, or sound at coincidence
  - Target tracked until motion stops
• Subjective: No Sound or Sound
• Objective: Stream or Bounce
Neuroscience of MSI
• Minimum 32 visual, 15 auditory, 8 somatosensory areas
identified in primate cortex
• Body control - proprioception, vestibular, vision, motor
control
• How are individual senses integrated?
• How is a unified perceptual experience created?
Multisensory neurons are
found at nearly every level
in the CNS
Multimodal Brain Areas
intraparietal sulcus
superior temporal sulcus
superior colliculus
inferior colliculus
Superior Colliculus
• Reflexive orienting to stimuli in
contralateral space
• Produces motor actions that are
guided by sensory stimuli
• Converging visual, auditory and
somatosensory projections from
numerous cortical and subcortical
sources
• Inputs – retina, cortex, IC, spinal cord
• Outputs – motor control of eyes, ears
and head
• Multilayered structure
• Superficial layers are visual – optic tectum in non-mammals
• Deeper layers are multisensory
• MS cells
• Inputs from 2 or more sensory systems – A/V, V/S, A/V/S
• Overlapping receptive fields
• Can respond to single sensory input – but weakly
• Preferentially (stronger) response to multiple inputs
• Receptive field – the region of sensory space within which stimuli preferentially drive a sensory neuron
• Central role in integration of info from different
modalities and generation of spatial orienting
responses
• Important as a MS structure, but also good as a
general model of MSI
• Early cellular neurobiology work by Stein et al.
• Single-cell recordings – MS cells in the SC of cats
Multisensory Enhancement
MS neurons – response to appropriate multisensory
stimuli exceeds the response to individual unisensory
inputs and can even exceed the sum of the
unisensory inputs (super-additivity)
But – get suppression of inappropriate (incoherent or
misaligned) stimuli (sub-additivity)
3 drivers of MS enhancement or super-additivity:
1. Spatial rule
MS stimuli must occur at the same region of space
2. Temporal rule
MS stimuli must reach the MS cell at the same time
3. Principle of Inverse effectiveness
Enhancement is greater for weak stimuli than strong
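Multisensory enhancement is commonly quantified as the percentage gain of the multisensory response over the best unisensory response, ME = 100 × (CM − SMmax) / SMmax. A sketch with hypothetical spike counts (illustrative numbers, not recorded data) showing both super-additivity and inverse effectiveness:

```python
def enhancement_index(multisensory, best_unisensory):
    """Percent multisensory enhancement over the best unisensory response."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

# Hypothetical mean spike counts per trial (illustrative numbers only)
strong_v, strong_a, strong_av = 20.0, 15.0, 40.0  # strong stimuli
weak_v, weak_a, weak_av = 2.0, 1.5, 9.0           # weak stimuli

me_strong = enhancement_index(strong_av, max(strong_v, strong_a))  # 100.0
me_weak = enhancement_index(weak_av, max(weak_v, weak_a))          # 350.0

# Inverse effectiveness: proportionally greater enhancement for weak stimuli
print(me_strong, me_weak)

# Super-additivity: the weak AV response (9.0) also exceeds the SUM of the
# weak unisensory responses (2.0 + 1.5)
```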
Spatial Rule unimodal
Unimodal
Visual or auditory alone
Weak response
Spatial Rule multimodal
Multimodal:
• Large spatial offset – depressed response, less than visual alone
• Smaller spatial offset – unimodal response, just the visual
• Spatial coincidence – strong multisensory response
Temporal Rule unimodal
Unimodal
Visual or auditory alone
Weak response
Temporal Rule multimodal
Multimodal:
• Asynchronous AV – unimodal response of whichever stimulus comes first
• Simultaneous AV – strong multisensory response
Superior Temporal Sulcus (STS)
• STS as a region involved in audio–visual speech
processing
• STS may be generally involved in binding auditory
and visual inputs, regardless of whether they
contain speech or biological motion
STS and the McGurk Effect
• Beauchamp et al. (2010)
• Use fMRI to identify MS area – left posterior STS
responded to both auditory and visual speech
• Show mismatched AV speech (McGurk) with and
without TMS of Left STS
• Control 1 – auditory only with and without TMS –
check if effect is speech in general or MSI
• Control 2 – TMS of second site
1. No TMS – mostly illusion
2. TMS of left STS - illusion down to 50% of trials
3. Control 1 – TMS and auditory only – no effect on
speech perception
4. Control 2 – TMS of second site – no effect on
McGurk susceptibility
Plasticity
• Stroke patient with a lesioned left posterior STS
• After recovery and rehab, reported good understanding of speech, though it was effortful
• Reported experiencing the McGurk effect
• Right STS showed activation, with larger volume than age-matched controls
Posterior Parietal Cortex (PPC)
• PPC plays a critical role in functions related to attention
allocation for unimodal and multisensory processing
• Reference frame remapping by the PPC may be critical for
aligning inputs to facilitate integration
• PPC is also ideally situated to mediate interactions between
the sensory systems by shaping processing in primary
sensory areas via feedback projections
• Anodal tDCS speeds reaction times for detecting auditory, visual and bimodal auditory–visual targets when applied over the right PPC
PPC and the Flash/Beep Illusion
• Flash/Beep (Shams et al., 2002): a single flash of light with multiple beeps
• Sound-induced illusory flashing – observers report 2 or more flashes when 2 or more beeps are presented (fission)
• Kamke et al (2012)
• Illusory flash perception is typically reported on only a proportion of trials, suggesting that stimulus characteristics alone do not determine the illusory percept
• TMS to one of two PPC areas (angular gyrus or supramarginal gyrus) or to S1 (control)
Unisensory Cortex and MSI
• Also modulation of low-level sensory cortex by other
modalities
• Human fMRI studies by Calvert et al. found that the
substantial improvement of auditory speech perception
when the speaker’s face was visible was accompanied
by a significant response enhancement in auditory
cortex
• Romei et al. demonstrated that auditory stimulation can decrease the threshold for perceiving phosphenes induced by single-pulse TMS applied over the occipital pole
• A single flash misperceived as two when paired with two beeps is accompanied by enhanced visual activity in V1 (Watkins et al., 2006)
• A double flash misperceived as a single flash due to a single beep relates to decreased V1 activity (Watkins et al., 2007)
• ERP work shows interactions between auditory and visual stimuli at very short latencies (46 ms to 150 ms)