Chapter 11: The Auditory Brain And Perceiving Auditory Scenes Flashcards
Ascending Auditory Pathways From Cochlea to Cortex
Cochlea to cochlear nucleus: ipsilateral
Main pathway is contralateral
Superior olivary complex
Structure in brain stem
- stop on ascending auditory pathway receiving signals from both cochlear nuclei
Cochlear Nucleus
Structure in brain stem that receives signals via Type I auditory nerve fiber from inner hair cells in ipsilateral ear
Inferior Colliculus
Structure in midbrain
- stop in ascending auditory pathway
Medial geniculate body
Structure in thalamus
- next stop on ascending pathway after inferior colliculus
Descending Pathways
- protective: prevents amplification of loud sounds
- carry signals between auditory cortex and lower structures in the pathway
- modulate the motile response of outer hair cells
- protect ear from damage through acoustic reflex activation
- block task-irrelevant ascending auditory signals and pass task-relevant ones
Acoustic Reflex
Contraction of tiny muscles attached to the ossicles that limits their movement in the presence of loud sounds and hence prevents overstimulation of the cochlea
Auditory Cortex
Part of cerebral cortex, tucked into lateral sulcus on top of temporal lobe
- consists of auditory core region, belt, and parabelt
Primary Auditory Cortex (A1)
Part of auditory core region
Tonotopic map
Arrangement of neurons within auditory brain regions such that characteristic frequencies of neurons gradually shift from lower at one end of region to higher at other end
Auditory Core Region
Part of auditory cortex, located within transverse temporal gyrus in each hemisphere
- consists of primary auditory cortex (A1), rostral core, and rostrotemporal core
Belt
Along with parabelt, region of cortex wrapped around and receiving signals from auditory core region
Parabelt
Along with belt, region of cortex wrapped around and receiving signals from auditory core region
Narrowly tuned neuron
Responds to a narrow range of frequencies; discrimination and recognition of sound sources occur in belt and parabelt
Broadly tuned neuron
Responds to a broad band of frequencies; the band broadens as amplitude increases
Involved in integrating component frequencies of complex sounds
- part of process of discriminating and recognizing sound sources
Regions of Interest (ROI)
Anterior ROI- responds to “what” task
Posterior ROI- responds to “where” task
Identifying sound source is important because it helps guide actions
- cochlea is organized tonotopically, with position in the cochlea representing frequency, not spatial location
- To represent the location of sound sources, the auditory system has evolved an exquisitely sensitive method based on comparing aspects of the sound arriving at the two ears
- Polar coordinate system based on two mutually perpendicular planes centered on the head is used to specify the locations of sound sources in 3-D space
Azimuth
In sound localization, location of a sound source in side-to-side dimension in horizontal plane— angle left/right of median plane
Perceiving azimuth in localization
Interaural level difference (ILD) works best for frequencies greater than 1000 Hz because the head attenuates the sound arriving at the far ear
- nonexistent when the source is directly in front of or behind you
- elevation and distance are the other two dimensions of sound localization
Elevation
Location of sound source in up-down dimension in median plane (angle above or below horizontal plane)
Distance
How far sound source is from center of head in any direction
Minimum Audible Angle
Minimum angular separation between a reference sound source and a different sound source emitting a tone of the same frequency that yields 75% correct judgements about the relative horizontal positions of the two sources
Interaural level difference (ILD)
Refers to difference in the sound level of the same sound at the two ears
Sources to the left or right of the median plane emit sounds that are more intense in the closer ear
- due to the acoustic shadow
Acoustic shadow
Area on the other side of the head from a sound source in which the loudness of the sound is reduced because the sound waves are partially blocked by the head
- has greater effect on high-frequency sounds than on low-frequency sounds
Interaural Time Difference
Difference in arrival time of same sound at two ears
*reflects how much sooner the same sound reaches one ear than the other
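As a rough illustration of how ITD depends on azimuth, Woodworth's spherical-head approximation, ITD ≈ (r/c)(θ + sin θ), relates the head radius r, the speed of sound c, and the azimuth angle θ. The sketch below is an assumption for illustration (the function name, default head radius, and use of this particular model are not from these notes):

```python
import math

def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (seconds) for a source at a
    given azimuth, using Woodworth's spherical-head model:
    ITD ~ (r/c) * (theta + sin(theta)), with theta the azimuth in radians."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source at 0° (straight ahead) gives zero ITD; a source at 90°
# (directly to one side) gives the maximum, roughly 0.6-0.7 ms.
print(itd_woodworth(0))            # 0.0
print(round(itd_woodworth(90) * 1e6))  # ~656 microseconds
```

This matches the flashcards above: ITD is zero for sources directly in front of or behind the listener and largest for sources off to one side.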
Cone of Confusion
Hypothetical cone-shaped surface in auditory space
- when equally distant sound sources are located on a cone of confusion, their locations are confusable because they have highly similar ILD and ITD
Perceiving elevation
Human pinnae provide information used to judge elevation
- pinna-induced spectrum modification provides information about the elevation of a sound source (spectral shape cue)
- pinna funnels sounds into ear
- changes the spectral shape of sound waves
Spectral shape cue
A pinna-induced modification in a sound’s frequency spectrum
- provides information about the elevation of the sound source
Perceiving Distance
- Perceived loudness can be used to judge if the sound is relatively near or far if the listener knows the source sound level
- Humans use the effect of the inverse square law to judge familiar sound distances
- Blurring cue can be used to judge the distance of sound sources
- Echoes can provide cues in situations in which there are hard surfaces
- The loudness cue also follows from the inverse square law: loudness increases as a sound source approaches
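The inverse square law mentioned above says intensity falls off as 1/r², so each doubling of distance drops the level by about 6 dB. A minimal sketch (the function name is my own, not from these notes):

```python
import math

def level_drop_db(r_near, r_far):
    """Sound level difference (dB) between two distances from a point
    source in free field, from the inverse square law: I is proportional
    to 1/r^2, so the drop is 10*log10((r_far/r_near)^2)."""
    return 10 * math.log10((r_far / r_near) ** 2)

# Doubling the distance reduces the level by about 6 dB.
print(round(level_drop_db(1.0, 2.0), 1))  # ~6.0
```

A listener who knows the source's typical level can therefore use loudness as a distance cue.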
Bats localizing sounds
Emit high-frequency sound sequences in the range of 20,000-100,000 Hz
- continuously track prey during prey-capture maneuvers
Humans localizing sounds
Use echolocation to judge distance from objects
Echolocation
Sound localization based on emitting sounds and then processing echoes to determine nature and location of object that produced echoes
Doppler effect
Frequency of sound emitted by moving sound source is higher in front of sound source than behind it
- Frequency rapidly decreases as sound source passes listener
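The Doppler shift described above can be worked out with the standard moving-source formula for a stationary listener, f′ = f·c/(c − v) when approaching and f·c/(c + v) when receding. A hedged sketch (function name and example values are mine, not from the notes):

```python
def doppler_frequency(f_source, v_source, speed_of_sound=343.0, approaching=True):
    """Perceived frequency for a stationary listener and a moving source:
    f' = f * c / (c - v) when approaching, f * c / (c + v) when receding."""
    c = speed_of_sound
    v = v_source if approaching else -v_source
    return f_source * c / (c - v)

# A 1000 Hz siren moving at 30 m/s sounds higher-pitched while
# approaching and lower-pitched after it passes the listener.
print(round(doppler_frequency(1000, 30, approaching=True)))   # ~1096 Hz
print(round(doppler_frequency(1000, 30, approaching=False)))  # ~920 Hz
```

The jump from ~1096 Hz to ~920 Hz as the source passes is the rapid frequency decrease the card describes.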
Looking while listening: vision and sound localization
- when visual and auditory information conflicts, the visual information takes precedence
- this is known as the ventriloquism effect, a form of visual capture
Ventriloquism Effect
Tendency to localize sound on basis of visual cues when visual and auditory cues provide conflicting info
Conditions needed for visual capture of sound localization:
- Visual information should occur just before the auditory information
- The events must make sense when linked
- Two types of information must be physically close in the environment
Precedence Effect
Localization of sound as originating from source in direction from which sound first arrives
- minimizes effect of echoes on sound localization
Neural basis of sound localization
- Proposed coincidence-detector neurons receive signals from left and right cochlear nuclei
- fire only if signals arrive at same time
*Localization: Medial superior olive (MSO)
Medial superior olive (MSO)
Part of superior olivary complex in brain stem
- thought to contain neurons functioning as specific ITD detection mechanisms
- represents azimuth of sound sources
- fire only if signals from two cochlear nuclei arrive at same time
- similar to bipolar cells in eyes
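The coincidence-detection idea in these cards can be sketched as a toy delay-line model: each MSO-like unit adds a fixed internal delay to the signal from one ear, and the unit whose delay exactly cancels the ITD receives both signals simultaneously. This is a simplified illustration under my own assumptions (function name, delay values), not the textbook's model code:

```python
def best_delay_neuron(left_spike_time_us, right_spike_time_us, internal_delays_us):
    """Toy coincidence detection: each neuron delays the left-ear signal
    by a fixed internal delay; the neuron whose delayed left signal
    arrives closest in time to the right signal is the best detector,
    and its delay encodes the ITD (and hence the azimuth)."""
    return min(
        internal_delays_us,
        key=lambda d: abs((left_spike_time_us + d) - right_spike_time_us),
    )

# A sound from the left reaches the right ear ~300 us later; the neuron
# with a 300 us internal delay on the left-ear line fires on coincidence.
print(best_delay_neuron(0, 300, [0, 100, 200, 300, 400]))  # -> 300
```

Because each unit responds best to one ITD, the population forms a map of azimuth, as the MSO card states.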
Auditory Regularities
- unrelated auditory streams rarely start or stop at exactly the same time
- the features of a single auditory stream (frequency, amplitude, and timbre) tend to change slowly and gradually over time. An abrupt change often signals a new auditory stream
- all the frequency components of a single auditory stream tend to change in the same way at the same time (ex. By growing louder as the sound source approaches the listener)
Visual Regularities
- unrelated visual objects rarely have exactly coinciding boundaries
- the surface features of a single object (e.g color and texture) tend to change slowly and gradually across adjacent locations. an abrupt change often coincides with the boundary between two objects
- all the parts of a single object tend to change in the same way at the same time (eg. by moving together, if the object is rigid)
Auditory Scene
All sound entering ears during current interval of time
Auditory Scene Analysis
Process of extracting and grouping together frequencies emitted by specific sound sources from among complex mixture of frequencies emitted by multiple sound sources within auditory scene
Principles of sequential grouping of auditory information
The auditory system groups sound like the visual system groups patterns (Gestalt grouping principles)
Auditory grouping principles
- changing at the same time
- having similar frequency
- being temporally similar
Auditory Stream
Assortment of frequencies occurring over time that were all emitted by the same sound source or related sound sources
Auditory Stream Segregation
Process of perceptual organization of auditory scene into set of distinct auditory streams
Harmonic Coherence
Harmonics of the same fundamental frequency tend to be grouped together
Grouping by synchrony or asynchrony
- grouping based on synchrony or asynchrony is based on the principle of common motion
- auditory system uses these regularities as very powerful grouping principles
Sequential Grouping
- Grouping by frequency similarity
- Auditory system uses the frequency similarity of sequential tones to group those tones into a single auditory stream or into multiple separate auditory streams
Grouping by Temporal Proximity
Perceptual completion of occluded sounds
- auditory system can perceptually complete sounds that are partially obscured (occluded) by other sounds
- Perceptual interpolation
Perceptual interpolation
- Auditory systems have evolved to automatically make “best guesses” about what’s really going on behind occluding sounds
- Making those guesses usually enables listeners to respond to environmental conditions more quickly and accurately than they otherwise would
- e.g., piecing together a conversation interrupted by noise
Sensory Substitution Device (SSD)
Any artificial aid in process of acquiring info via one sense that is usually acquired via another sense