Chapter 12 - Hearing and the environment Flashcards
1- Some terminology
- Sound localization
  - The ability to identify the location of a sound source in a sound field.
- Precedence effect
  - When a sound is followed by another sound separated by a sufficiently short time delay (below the listener's echo threshold), listeners perceive a single auditory event.
- Auditory stream analysis
  - The ability to separate the individual sound sources in the environment and locate them in space.
- Perceptual grouping
  - Putting parts together into a whole.
2- Comparing Location Information for Vision and Hearing
In vision, the location of a stimulus is given directly by where it falls on the retina. In hearing, location cues are not contained in the receptor cells: the cochlea codes frequency, not position, so the location of a sound must be calculated from other cues.
3- Auditory Localization
- Auditory space: surrounds an observer and exists wherever there is sound
- Tones with the same frequency activate the cochlea (hair cells) in the same way regardless of where they come from.
- Localization cues
  - Researchers study how sounds are localized in space using three coordinates (see the sketch at the end of this card):
    - Azimuth coordinates: position left to right
    - Elevation coordinates: position up and down
    - Distance coordinates: distance from the observer (the most difficult to judge)
- On average, people can localize sounds:
- Directly in front of them most accurately
- To the sides and behind their heads least accurately
- Location cues are not contained in the receptor cells, as they are on the retina in vision; the location of a sound must therefore be calculated from other cues.
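A minimal sketch of how the three coordinates relate to a source position, assuming the listener is at the origin with x pointing straight ahead, y to the listener's left, and z upward; the coordinate convention and the example position are illustrative, not from the notes.

```python
import math

def sound_source_coordinates(x, y, z):
    """Convert a source position (metres) into the three localization coordinates.
    Assumed convention: listener at the origin, x straight ahead, y to the left, z up."""
    distance = math.sqrt(x**2 + y**2 + z**2)           # distance coordinate
    azimuth = math.degrees(math.atan2(y, x))           # left/right angle; 0 deg = straight ahead
    elevation = math.degrees(math.asin(z / distance))  # up/down angle; 0 deg = ear level
    return azimuth, elevation, distance

# A hypothetical source 2 m ahead, 1 m to the left, 0.5 m above ear level:
print(sound_source_coordinates(2.0, 1.0, 0.5))  # ~ (26.6 deg, 12.6 deg, 2.29 m)
```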
4- Binaural Cues for Sound Localization
Binaural cues: location cues based on a comparison of the signals received by the left and right ears.
* Interaural time difference (ITD): difference between the times that sounds reach the two ears
- When distance to each ear is the same, there are no
differences in time.
- When the source is to the side of the observer, the times will differ.
* Interaural level difference (ILD): difference in sound pressure level reaching the two ears.
  - A reduction in intensity occurs at the far ear for high-frequency sounds: the head casts an acoustic shadow.
  - This effect does not occur for low-frequency sounds, whose wavelengths are larger than the head and so bend (diffract) around it (see the sketch at the end of this card).
Cone of confusion: a cone-shaped set of locations extending out from each ear that all produce the same ITD and ILD, so binaural cues alone cannot distinguish among them (for example, front from back); head movements and spectral cues help resolve the ambiguity.
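A rough numerical sketch of the two binaural cues, using the classic Woodworth spherical-head approximation for ITD and a simple wavelength-versus-head-size check for the acoustic shadow; the head radius, frequencies, and wavelength cutoff are illustrative assumptions, not values from the notes.

```python
import math

HEAD_RADIUS = 0.0875      # m, a typical adult value (assumed, not from the notes)
SPEED_OF_SOUND = 343.0    # m/s in air at room temperature

def itd_woodworth(azimuth_deg):
    """ITD for a distant source using the Woodworth spherical-head approximation."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(f"ITD straight ahead (0 deg): {itd_woodworth(0) * 1e6:.0f} microseconds")
print(f"ITD directly to the side (90 deg): {itd_woodworth(90) * 1e6:.0f} microseconds")

# The head casts an acoustic shadow (and so produces an ILD) only when the
# wavelength is small relative to the head; the cutoff used here is rough.
for freq in (200.0, 2000.0, 6000.0):
    wavelength = SPEED_OF_SOUND / freq
    shadowed = wavelength < 2 * HEAD_RADIUS
    print(f"{freq:6.0f} Hz: wavelength {wavelength * 100:5.1f} cm -> "
          f"{'acoustic shadow (useful ILD)' if shadowed else 'bends around the head (little ILD)'}")
```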
5- Monaural Cue for Sound Location
ILD and ITD are not effective for judging elevation, since at many elevations they may be the same (often zero). The remaining monaural cue is a spectral cue: the pinnae shape the frequency spectrum of an incoming sound differently depending on its elevation.
Experiment investigating spectral cues
* Listeners' accuracy at locating sounds that differed in elevation was measured.
* They were then fitted with a mold that changed the shape of their pinnae.
* Right after the molds were inserted, performance was poor for elevation but was unaffected for azimuth.
* After 19 days, performance for elevation was close to original performance.
* Once the molds were removed, performance stayed high.
* This suggests that there might be two different sets of neurons—one for each set of cues.
6- The Physiology of Auditory Localization
Jeffress Neural Coincidence Model
* Neurons are wired so they each receive signals from the two ears.
* Coincidence detectors: each neuron fires only when signals from the left ear and the right ear arrive at it at the same time.
* ITD detectors: because of the wiring, each coincidence detector responds best to one specific ITD; such ITD-tuned neurons have been recorded in the inferior colliculus.
* Place code: the ITD, and therefore the sound's location, is indicated by which neuron fires.
* ITD tuning curves: firing rate plotted as a function of ITD; the Jeffress model predicts narrow tuning curves.
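A toy illustration of the coincidence-detector idea: each candidate internal delay plays the role of one detector, and the detector whose delay best re-aligns the left- and right-ear signals responds most strongly, giving a place code for ITD. The cross-correlation readout, sampling rate, and delay value are illustrative assumptions, not the model's actual neural circuitry.

```python
import numpy as np

def jeffress_readout(left, right, fs, max_itd_s=0.0007):
    """Return the internal delay (in seconds) whose 'coincidence detector' responds
    most strongly; positive values mean the left ear leads."""
    max_lag = int(max_itd_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    responses = [np.sum(left * np.roll(right, -lag)) for lag in lags]
    return lags[int(np.argmax(responses))] / fs

# Demo: a noise burst that reaches the right ear 0.3 ms after the left ear.
fs = 44100
rng = np.random.default_rng(0)
burst = rng.standard_normal(fs // 10)
left = burst
right = np.roll(burst, int(0.0003 * fs))   # wrap-around from np.roll is negligible here
print(f"Recovered ITD: {jeffress_readout(left, right, fs) * 1e6:.0f} microseconds")  # ~300
```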
Broad ITD Tuning Curves in Mammals
Coding for localization is based on two populations of broadly tuned neurons:
* neurons in the right hemisphere that respond best when sound comes from the left, and
* neurons in the left hemisphere that respond best when sound comes from the right.
The location of a sound is indicated by the relative responses of these two populations (see the sketch below).
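A minimal sketch of this opponent-style readout, assuming (purely for illustration) that each hemisphere's population response is a shallow sigmoid of azimuth; location is recovered from the difference between the two responses rather than from any single sharply tuned neuron.

```python
import numpy as np

def population_response(azimuth_deg, preferred_side):
    """A broadly tuned population response: a shallow sigmoid of azimuth
    (negative azimuth = left). The sigmoid shape and 30-deg slope are
    illustrative assumptions, not values from the notes."""
    sign = -1.0 if preferred_side == "left" else 1.0
    return 1.0 / (1.0 + np.exp(-sign * azimuth_deg / 30.0))

for azimuth in (-60, -20, 0, 20, 60):
    right_hem = population_response(azimuth, "left")    # right hemisphere prefers sounds on the left
    left_hem = population_response(azimuth, "right")    # left hemisphere prefers sounds on the right
    # The readout is the *relative* response of the two populations,
    # not the peak of a single sharply tuned neuron.
    print(f"azimuth {azimuth:+4d} deg: right hem {right_hem:.2f}, "
          f"left hem {left_hem:.2f}, difference {left_hem - right_hem:+.2f}")
```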
7- Cortical Mechanisms of Location
* Area A1 is involved in locating sound.
  - Neff's research on cats
* The posterior belt area is involved in locating sound.
  - Recanzone's research on monkey neurons
* The anterior belt area is involved in perceiving sound.
What and Where Auditory Pathways
The what, or ventral, stream starts in the anterior portion of the core and belt and extends to the prefrontal cortex; it is used to identify sounds.
- What = temporal lobe
The where, or dorsal, stream starts in the posterior core and belt and extends to the parietal and prefrontal cortices; it is used to locate sounds.
- Where = parietal lobe
Evidence from neural recordings, brain damage, and brain scanning supports these findings.
8- Hearing Inside Rooms
- Direct sound: sound that reaches the listener's ears straight from the source.
- Indirect sound: sound that is reflected off environmental surfaces before reaching the listener.
- When a listener is outside, most sound is direct; inside a building, there is both direct and indirect sound.
9- Perceiving Two Sounds That Reach the Ears at Different Times
Experiment by Litovsky et al.
* Listeners sat between two speakers: a lead speaker
and a lag speaker.
* When sound comes from the lead speaker followed by
the lag speaker with a long delay, listeners hear two
sounds.
* When the delay is decreased to about 5-20 msec, listeners hear a single sound coming only from the lead speaker: the precedence effect.
10- Architectural Acoustics
The study of how sounds are reflected in rooms.
Factor that affects perception in concert halls
* Reverberation time: the time it takes sound to
decrease by 1/1000th of its original pressure
- If it is too long, sounds are “muddled”.
- If it is too short, sounds are “dead”.
- Ideal times are around two seconds.
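A short worked check: a drop to 1/1000 of the original pressure corresponds to the standard 60 dB reverberation-time criterion, and Sabine's formula (standard room acoustics, not given in the notes) shows how a hall's volume and absorption could yield a time near the two-second ideal; the hall dimensions below are hypothetical.

```python
import math

# A drop to 1/1000 of the original sound pressure is the standard 60 dB criterion:
level_change_db = 20 * math.log10(1 / 1000)
print(f"Pressure falling to 1/1000 of its original value = {level_change_db:.0f} dB")  # -60 dB

# Sabine's formula estimates reverberation time from room volume V (m^3)
# and total equivalent absorption A (m^2):  RT60 ~= 0.161 * V / A
def rt60_sabine(volume_m3, absorption_m2):
    return 0.161 * volume_m3 / absorption_m2

# Hypothetical concert hall: 20,000 m^3 of volume, 1,600 m^2 of absorption.
print(f"Estimated reverberation time: {rt60_sabine(20_000, 1_600):.1f} s")  # ~2.0 s, near the ideal
```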
Other factors that affect perception
* Intimacy time: the time between when sound leaves its source and when the first reflection arrives.
  - The best time is around 20 ms.
* Bass ratio: the ratio of low to middle frequencies reflected from surfaces.
  - High bass ratios are best.
* Spaciousness factor: the fraction of all the sound received by a listener that is indirect.
  - High spaciousness factors are best.
11- The Auditory Scene: Separating Sound Sources
Auditory scene: the array of all sound sources in the
environment.
Auditory scene analysis: process by which sound sources in the auditory scene are separated into individual
perceptions.
* This separation cannot happen at the cochlea, since simultaneous sounds are combined in the pattern of vibration of the basilar membrane.
Heuristics that help to perceptually organize stimuli
Simultaneous grouping
* Onset time: sounds that start at different times are likely to come from different
sources.
* Location: a single sound source tends to come from one location and to move
continuously.
* Similarity of timbre and pitch: similar sounds are grouped together.
- Sequential grouping
  - Compound melodic line in music is an example of auditory stream segregation.
  - Experiment by Bregman and Campbell (see the stimulus sketch below)
    - Stimuli were alternating high and low tones.
    - When the stimuli are played slowly, the perception is of high and low tones alternating in a single sequence.
    - When the stimuli are played quickly, the listener hears two streams, one high and one low.
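A small sketch of how such a stimulus could be constructed; the tone frequencies, durations, and number of tones are illustrative assumptions, and shortening the tone duration is the speed manipulation that promotes segregation into two streams.

```python
import numpy as np

def alternating_tone_sequence(high_hz=2000.0, low_hz=350.0, tone_s=0.1, n_pairs=10, fs=44100):
    """Build an alternating high/low pure-tone sequence (values are illustrative)."""
    t = np.arange(int(tone_s * fs)) / fs
    high = np.sin(2 * np.pi * high_hz * t)
    low = np.sin(2 * np.pi * low_hz * t)
    return np.concatenate([np.concatenate([high, low]) for _ in range(n_pairs)])

slow = alternating_tone_sequence(tone_s=0.4)    # tends to be heard as one alternating melody
fast = alternating_tone_sequence(tone_s=0.08)   # tends to split into a high stream and a low stream
print(len(slow) / 44100, len(fast) / 44100)     # durations in seconds: 8.0 and 1.6
```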
Experiment by Deutsch: the scale illusion or melodic channeling
* Stimuli were two sequences alternating between the right and left ears.
* Listeners perceive two smooth sequences by grouping the sounds by
similarity in pitch.
* This demonstrates the perceptual heuristic that sounds similar in frequency tend to come from the same source, which is usually true in the environment.
Experiment by Warren et al.
* Tones were presented
interrupted by gaps of silence or by noise.
* In the silence condition,
listeners perceived that the
sound stopped during the gaps.
* In the noise condition, the
perception was that the sound continued behind the noise.
Effect of past experience
* Experiment by Dowling
* Melody “Three Blind Mice” is played with notes alternating between octaves.
* Listeners find it difficult to identify the song.
* However, after they hear the normal melody, they can then hear it in the modified version by using a melody schema.
12- Connections Between Hearing and Vision
Visual capture, or the ventriloquist effect: an observer perceives a sound as coming from its seen (visual) location rather than from its actual source.
Two-flash illusion: when a single flash is accompanied by two brief beeps, observers perceive two flashes; here sound alters what is seen.
Physiology
The interaction between vision and hearing is multisensory in nature: the same brain areas can be driven by input from both senses.
Thaler et al. (2011) studied expert blind echolocators who produce clicking sounds and found that the echo signals activated the brain's visual cortex.