Multisensory Integration Flashcards
Perception to action
Psychophysics- present sensory stimuli and ppts report whether they perceived anything
Lighting a match- seemingly easy: reaching, grasping, manipulating objects
Seemingly hard: logic puzzles
Computers are good at the seemingly hard tasks
Reaching/grasping- robots are bad at the seemingly easy tasks
Moravec’s paradox
1988
High-level cognitive reasoning tasks- easy for computers
Low-level sensorimotor tasks, perception- hard
Computers show adult-level performance on intelligence tests or playing checkers
But it is difficult or impossible to give them the skills of a 1-year-old when it comes to perception and mobility
Bias- we assumed that tasks easy for us would be simple for robots
Lighting match
Anaesthetise finger- loss of touch sensation from fingers - sight still there
Does not affect motor control
When sensation is blocked, much harder to light the match
25s- sensation blocked
5s- normal
Shows vision and touch sensation are very important even for a simple task
Multi sensory integration
Touch and vision integrate
Challenges in multi sensory integration
Transforming representations from different senses into a common representational space
Integrating info from different senses into a coherent percept -> resolving mismatches
Reference frame
Representational schemas: how information from different senses is formatted
Snake game:
Player perspective: coordinates of the snake in the game- top-down view
Snake perspective: sees the world through its eyes- commands are turn left/right
Problem: sensory input is in the player's frame, control is in the snake's frame -> need to know the body layout (position and heading) of the snake, as in the sketch below
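A minimal sketch of that conversion, assuming a 2-D world; the function name and threshold are invented for illustration:

```python
import math

# Hypothetical sketch: translate a target given in the player's top-down
# world frame into a turn command in the snake's egocentric frame.
# Needs the snake's "body layout": head position + heading.
def turn_command(head_xy, heading_rad, target_xy):
    dx = target_xy[0] - head_xy[0]
    dy = target_xy[1] - head_xy[1]
    angle_world = math.atan2(dy, dx)            # target bearing, world frame
    # Re-express the bearing relative to the heading, wrapped to (-pi, pi]:
    angle_ego = (angle_world - heading_rad + math.pi) % (2 * math.pi) - math.pi
    if abs(angle_ego) < 0.1:
        return "straight"
    return "left" if angle_ego > 0 else "right"

# Food at (5, 7) in player coordinates; head at (5, 3) facing +y (pi/2):
print(turn_command((5, 3), math.pi / 2, (5, 7)))  # -> straight
```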
Reference frames for different senses
Vision = eye centred/ retinal- location of visual stimulus on retina
Audition= head centred- location of sound source with respect to the ears
Touch= body centred- location of tactile stimulus on skin
Need to convert between the reference frames and to external space
Coordinate transformation
Hearing a dog barking and seeing the dog = separate reference frames
To convert between the two, you have to know the
angle between the two frames (e.g. current eye position relative to the head)
Converting between frames- you have to know the position and orientation of the relevant body parts (sketch below)
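A minimal sketch of such a coordinate transformation, assuming simple 2-D frames; all names and numbers are illustrative only:

```python
import math

# Map a point from a child frame (e.g. eye-centred) into its parent frame
# (e.g. head-centred) with a 2-D rotation + translation. Chaining the
# transform walks the location out towards body-centred / external space.
def to_parent_frame(point_xy, frame_origin_xy, frame_angle_rad):
    x, y = point_xy
    c, s = math.cos(frame_angle_rad), math.sin(frame_angle_rad)
    return (frame_origin_xy[0] + c * x - s * y,
            frame_origin_xy[1] + s * x + c * y)

# A visual stimulus at (10, 0) in eye coordinates; eyes rotated 20 deg in
# the head, head rotated 15 deg on the trunk -> body-centred location:
in_head = to_parent_frame((10.0, 0.0), (0.0, 0.0), math.radians(20))
in_body = to_parent_frame(in_head, (0.0, 0.0), math.radians(15))
print(in_body)
```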
The body schema
Spatially coded: position of each body part in external space
Modular: different body parts processed in different brain regions
Updated with movement: automated and continuous tracking of body posture
Adaptable: changes when body changes
Supramodal: combines input from proprioception, touch and vision
Coherent: resolves perceptual conflicts
Interpersonal: observed actions are represented within the same body schema
Types of body representation
Body schema: sensorimotor representation that guides action
Body image: body percept, body concept, body affect -> how we think/ feel about our current body
Does body posture affect perception?
Temporal judgement task
Stimulate both hands in random order; ppts have to stretch the fingers of the hand that was stimulated first
Arms crossed / arms uncrossed
When arms crossed- ppts mix up which hand was stimulated first
Solving the task does not need input from the body schema when the arms are uncrossed
Body schema interferes with basic perception
How does a body schema develop
6mnth old
Schema starts to interfere with tactile orienting
Shown by crossing/uncrossing the feet and buzzing one foot
4mnths- no difference if feet crossed, reaches for the correct foot either way
6mnths- baby reaches for the correct foot, more often correct when feet uncrossed
No arm crossing
Tactile discrimination task- determine which finger was vibrated
Visual distractor either on the same hand or the other hand
Distractors led to response delays
Distractors on the same hand (congruent location) lead to longer delays than distractors on the other hand
62ms vs 20ms
Arm crossing
Tactile stimulus on the same side of the body, visual stimulus on a different side
Effect of visual distractor moves with the hand during arm crossing
Cross modal interactions mediated by body schema
Opposite pattern of delays
67ms vs 3ms
Peri personal space (PPS)
Space immediately surrounding our bodies
Objects in PPS can be grasped and manipulated immediately
Tool use
Extending the body
Tools are incorporated into the body schema
Cross modal congruency effects apply during tool use
Crossing the tools (without crossing body parts) = same delay effects
Tools become part of the body schema, represented the same way in the brain
Alice in wonderland syndrome
Distortion in perception of size
Body parts might appear smaller (microsomatognosia) or bigger than they are (macrosomatognosia)
Can also affect the whole body
Associated with childhood and migraines
Autotopagnosia
Unable to locate body parts
Loss of spatial unity of body
Patients can name body parts but the spatial order is lost
Can’t point out where body part is, unaware of how it looks
Finger agnosia- fused percept of the fingers, cannot indicate which finger was stimulated
Phantom limb
Still feel presence of limb even after loss of limb
May include agency/ movement
Associated with pain
Can change size over time- shrink / telescoping
Double dissociation
Supports a decomposition of the concept of body representation (e.g. body schema vs body image)
Cross modal neurons
Neuron fires when an object touches the hand or comes within reach of it- responds both when the hand is touched and when an object moves near the hand
Neuron responds to visual and tactile stimuli
Visual receptive field moves with the position of the hand
Modified by body schema
Neurons incorporating tools
When the monkey holds a tool, the neuron's response space expands
Expansion of Peripersonal space during tool use reflected in neural response
As response field expands, body schema encompasses the tool
Integration problem
To see something- it is represented in external space
To hear- the auditory input is represented in a different reference frame
Have to form a percept according to both inputs -> sensory conflict
Sensory conflict
Different senses might provide conflicting information about a sensory stimulus
Needs to be resolved
Testing conflicts between vision and touch
Judge size of object by vision and touch
Look through reducing lens
If we go by vision, the object looks a lot smaller
If we go by touch, the object feels a lot bigger
Visual capture
Assess the size of a cube via pointing, feeling, drawing
Vision dominates perceived object size- visual capture- trust visual sense much more
Sensory hierarchy?
Auditory can dominate vision
Ppts report the no. of visual flashes seen
Auditory beeps played during flashes
No. of auditory beeps determines reported no. of visual flashes
Modality precision hypothesis
Modality with the highest precision (lowest uncertainty) is chosen, depending on the task
Spatial task- vision is chosen as it has the highest accuracy
Temporal task- audition has much higher accuracy
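A toy illustration of this winner-take-all reading of the hypothesis; the variance numbers are invented:

```python
# Pick the modality with the lowest uncertainty (variance) for a given task.
def dominant_modality(variances):
    return min(variances, key=variances.get)

spatial  = {"vision": 0.5, "audition": 4.0}   # vision localises better
temporal = {"vision": 6.0, "audition": 0.8}   # audition times events better

print(dominant_modality(spatial))    # -> vision  (visual capture of location)
print(dominant_modality(temporal))   # -> audition (beeps drive counted flashes)
```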
Sensory uncertainty occurs due to
Perceptual limits- visual resolution determined by spacing of photoreceptors in fovea
Neural noise- synaptic noise
Cognitive resource limits- attention
Sensory modality changes
Ernst and Banks (2002)
Created artificial conflict- judge height of bar
Visual height and haptic height
Virtual reality setup- the bar is not actually there- an illusion
Change height of bar, modify uncertainty by adding visual noise
Haptic- force feedback device changes the height of the bar; ppts judge the size of the bar by touch
Vision and haptic input can be conflicting
Normative model
How a problem should be solved -> optimal solution
Process model- how a problem is actually solved- based on data
How to solve problem of sensory integration
Pick an integration method that minimises sensory uncertainty
Haptic estimate (haptic uncertainty) + visual estimate (visual uncertainty) -> integrated signal (smallest possible combined uncertainty)
Need to integrate both to come up with best possible answer
Integrating probabilities
Consider the probability distribution of each estimate
When considering bar height- high variance means high sensory uncertainty
-> the estimated height has a large range
Different senses different levels of uncertainty
When a sensory conflict occurs there are 2 distributions- a haptic and visual
-> both come with their own levels of uncertainty
Optimal estimate
Either estimate alone has a higher variance
Combining both = lower variance
Combined estimate is biased towards the visual estimate, as it has lower uncertainty than the haptic estimate
Optimal weights
Can be calculated from variance (sensory uncertainty) of the visual and haptic distribution
Minimal variance
The combined sensory variance is lower than either single-cue variance
Integration of info from multiple sources causes uncertainty to decrease (formulas below)
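The standard maximum-likelihood formulas behind these points (the scheme Ernst & Banks, 2002 tested; notation mine, V = vision, H = haptics):

```latex
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\quad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\quad w_H = 1 - w_V
```

and the combined (minimal) variance:

```latex
\sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2}
\le \min(\sigma_V^2, \sigma_H^2)
```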
Experimental evidence
Present a virtual bar with a sensory conflict- it looks visually longer than it feels (haptic)
Compare against bar without conflict and see what length is judged
Visual= 60mm, haptic= 50mm
Determine the point of subjective equality (PSE)
Manipulate sensory uncertainty of visual feedback
Before adding visual noise, perception of bar length is biased towards the visual input
(the PSE sits near the 60mm visual length rather than the 50mm haptic length)
With increased visual noise, perception of the bar is determined by both visual and haptic input
The noisier the visual input, the more we rely on the haptic input
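A small sketch of how the predicted percept shifts with visual noise, using the 60mm visual / 50mm haptic conflict above; the variance values are invented for illustration only:

```python
# Reliability-weighted combination of a visual and a haptic size estimate.
def combine(mu_v, var_v, mu_h, var_h):
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    mu = w_v * mu_v + (1 - w_v) * mu_h
    var = (var_v * var_h) / (var_v + var_h)   # always <= min(var_v, var_h)
    return mu, var

for var_v in (1.0, 4.0, 16.0):                # increasing visual noise
    mu, var = combine(60.0, var_v, 50.0, 4.0)
    print(f"visual var {var_v:5.1f} -> percept {mu:.1f} mm (combined var {var:.2f})")
# low visual noise: percept near 60mm (visual capture)
# high visual noise: percept shifts towards the 50mm haptic length
```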
Human performance
Follows optimal sensory integration rules
As predicted, ppts move from visual to haptic capture when visual noise is large
Do we always integrate info optimally?
Have to know uncertainty for optimal integration:
Can be hard to estimate
Easier in sensory perception than cognitive reasoning
Calculations can be intractable/ take a long time:
Heuristics are suboptimal, but good-enough solutions are often satisfactory
Integrate info optimally
Good at estimating sensory noise, bad at estimating cognitive noise
Task- estimate the tilt of gratings
Add sensory noise or add cognitive noise
Ppts integrate sensory noise optimally but cognitive noise sub-optimally
Are probabilities encoded?
Uncertainty represented with prob distributions
Confidence signals
Representation of the full probability distribution- e.g. a normal distribution
Correspondence problem
How do we know there is only one stimulus in the first place?
How do we know there are not 2- one that we can see and one that we can hear?