Chapter 7: Auditory, Touch, Taste & Smell Flashcards
if a tree falls in the forest, does it make a sound?
> what is sound?
Physically, yes. The tree hitting the ground still produces pressure changes in the air or other medium
Perceptually, no. No one was around to experience the sound of the tree hitting the ground
sound waves
- what creates sound waves?
The alternating increases and decreases in pressure create SOUND WAVES
Although pressure changes move outward from the speaker, the air molecules at each location move back and forth in the same place
sound waves
What is transmitted is the pattern of increases and decreases of pressure that eventually reach the ear
see diagram 7a
components of the auditory signal; frequency (pitch)
- measured in __?
physical (perceptual)
1) frequency (pitch): the rate at which waves vibrate, measured as cycles per second, or hertz (Hz). Frequency roughly corresponds to our perception of pitch
see picture 7a
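worked example (not from the slides): a 440 Hz tone (concert A) completes 440 cycles per second, so each cycle lasts 1/440 ≈ 2.3 ms; doubling the frequency to 880 Hz is heard as a pitch one octave higher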
components of the auditory signal; amplitude (loudness)
- measured in what?
2) amplitude (loudness): the intensity of sound, usually measured in decibels (dB). amplitude roughly corresponds to our perception of loudness
see picture 7a
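for reference (a standard formula, not given in the slides): sound pressure level in decibels is dB SPL = 20 × log10(p / p0), where p0 = 20 µPa is roughly the quietest pressure we can hear; a sound with 100 times that reference pressure is 20 × log10(100) = 40 dB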
components of the auditory signal; complexity (timbre)
3) complexity (timbre): most sounds are a mixture of frequencies. the particular mixture determines the sound’s timbre, or perceived uniqueness. timbre provides information about the nature of a sound. for example, timbre allows us to distinguish the sound of a trombone from that of a violin playing the same note
see picture 7a
the audibility curve
- what frequency (Hz) are humans most sensitive to?
- what is loudness dependent on?
Humans are most sensitive to frequencies between 2,000 and 4,000 Hz, which is the frequency range for most speech sounds; the overall range of hearing is 20-20,000 Hz
Loudness depends on both frequency and sound pressure
frequency comparison
- human ear can perceive sounds anywhere between ___ Hz?
The human ear can perceive sounds anywhere between 20 and 20,000 Hz
Dogs: 40-60,000 Hz
Cats: 100-32,000 Hz
Elephants: 16-12,000 Hz
Bats: 10,000-150,000 Hz
Rodents: 70-150,000 Hz
the ear
see picture 7a
vestibular system
- what organ detects acceleration?
see picture 7a
OTOLITH ORGANS DETECT ACCELERATION
(a) You can see a person standing still, and the otolith is centered over the hair cells.
(b) The person has just started moving. The otolith is left behind by inertia, causing the hair cells to bend. This opens ion channels, just like in the auditory system, causing the cells to depolarize or hyperpolarize.
the cochlea
- cochlear partition contains what?
- hair cells are the receptors for?
- cilia produce what signal?
Cochlear partition contains the organ of Corti
Hair cells are the receptors for hearing and the cilia produce electrical signals (i.e., transduction)
hair cells
- what type of movement causes ion channels to OPEN or CLOSE?
Movement of the hair cilia in one direction (right) causes ion channels to open
Movement in the other direction (left) causes ion channels to close
Back and forth movement causes bursts of electrical signals
pitch perception: place coding
- which codes high frequency sounds?
- which codes lower frequency sounds?
The base of the basilar membrane codes high frequency sounds, the apex (end) codes lower frequency sounds
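one commonly used approximation of this place-to-frequency mapping (the Greenwood function; an outside reference, not from the slides) is f ≈ 165.4 × (10^(2.1x) − 0.88) Hz, where x is the fraction of the distance from apex to base: x = 0 (apex) gives roughly 20 Hz and x = 1 (base) gives roughly 20,000 Hz, matching the human hearing range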
putting it all together (7a)
- see diagram
1) pinna catches sound waves and deflects them into the external ear canal
2) waves are amplified and directed to the eardrum, causing it to vibrate,…
3) … which in turn vibrates ossicles
4) ossicles amplify and convey vibrations to the oval window
5) vibration of the oval window sends waves through cochlear fluid…
6) … causing the basilar and tectorial membranes to bend….
7) …which in turn causes the cilia of outer hair cells, embedded in the tectorial membrane, to bend. This bending generates neural activity in hair cells
coincidence detectors
- signal from the right ear travels __
- signal from the left ear travels ___
- signals would be combined by ___?
The signal from the right ear would travel farther along the neurons, whereas the signal from the left ear would travel less far.
These signals would combine by spatial summation at coincidence detector A in the olive, enabling you to locate the source of the sound.
7a ppt recsp / stimulus for hearing
- what is the stimulus for hearing?
- what is sound made up of?
the stimulus for hearing is sound
but sound is made up of the vibrations of air molecules - it has to get picked up by auditory receptors in order to produce transduction and pass that information on to the rest of the brain
auditory receptors are located in the cochlea (inner ear) and they consist of hair cells
hair cells are one type of receptor; they basically take the waveform that's coming into the cochlea and translate it into action potentials
for the auditory system to know what frequency or pitch is coming to the ear, it depends on where along the cochlea / basilar membrane the hair cells are stimulated (THE PLACE THEORY)
- different hair cells will code for different frequencies
it is organized into a tonotopic map (high frequency at the base, lower frequency at the apex)
visual vs auditory localization
- image on the retina contains what?
- visual system has the ability to code for?
The image on the retina contains spatial information
- the visual system has the ability to code for space
- it is really good at spatial coding
Different frequencies of sound are coded at different points on the basilar membrane, but contain no spatial information
- basilar membrane just contains frequencies
- it can't code for space because it has no spatial information
notes:
the place stimulated by the tweet and meow on the basilar membrane doesn’t tell the auditory system where the sound is coming from
binaural cues (two primary binaural cues)
- which one is effective for lower frequency?
- which one works well for high freq sounds?
binaural cues: cues that can be derived from sound information arriving at both ears
Interaural time difference (ITD) - is based on the fact that there can be differences in the time it takes a sound to arrive at one ear compared to the other (left ear vs right ear)
- Effective for lower frequency sounds
Interaural level difference (ILD) - is based on the difference in sound pressure (amplitude or loudness) levels reaching the two ears (right vs left ear)
- Works well for high frequency sounds
our ears are located on different sides of our head
- both of these cues rely on a comparison between the two ears when you hear a sound
interaural time difference
Uses differences in time of sound arrival to each ear (right before left)
Works best for lower frequency sounds
if you have a sound coming from directly in front of you, the sound waves will reach both of our ears
- the time difference will be essentially zero because the sound arrives at both ears at the same time (our ears are positioned symmetrically on our head)
if a sound is coming from our right side, the sound will reach the right ear first before it travels around our head and gets to the left ear
- the arrival times will be different
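a rough geometric estimate of the size of this cue (the Woodworth formula; an outside assumption, not from the slides): ITD ≈ (r/c)(θ + sin θ), with head radius r ≈ 0.09 m and speed of sound c ≈ 343 m/s; for a sound directly to one side (θ = 90°) this works out to roughly 0.6-0.7 ms, so the time differences the brain must detect are always under a millisecond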
interaural level difference
Localization for high frequency sounds is accomplished using intensity differences
The head creates an acoustic shadow which causes differences in the intensities detected by the two ears
- high frequency sounds create this acoustic shadow
- the idea is that the sound pressure will be different at the left and right ears
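a quick worked example of why the shadow only appears for high frequencies (not from the slides): wavelength = speed of sound / frequency, so a 6,000 Hz tone has a wavelength of about 343 / 6,000 ≈ 6 cm, smaller than the head, and the head blocks it; a 500 Hz tone has a wavelength of about 69 cm, which bends (diffracts) around the head, so the level difference between the ears is tiny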
coincidence detectors
- signal from right ear will travel __ the neurons
- signal from left ear would travel ___
signals would combine by ____ at coincidence detector
The signal from the right ear would travel farther along the neurons, whereas the signal from the left ear would travel less far.
These signals would combine by spatial summation at coincidence detector A in the olive, enabling you to locate the source of the sound.
(happens in the superior olivary nuclei)
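a minimal Python sketch of this delay-line idea (a toy Jeffress-style model; the sample rate, tone frequency, and variable names are illustrative assumptions, not the course's model):

```python
import numpy as np

# Toy Jeffress-style coincidence detection: each detector pairs a delayed copy of
# the left-ear signal with a delayed copy of the right-ear signal; the detector
# whose internal delay cancels the external interaural time difference (ITD)
# receives the most coincident input and responds most strongly.

fs = 100_000                          # sample rate (Hz)
t = np.arange(0, 0.02, 1 / fs)        # 20 ms of signal
tone = np.sin(2 * np.pi * 500 * t)    # 500 Hz tone (low frequency, where ITD works best)

itd = 300e-6                          # assume the sound reaches the right ear 300 µs earlier
right = tone
left = np.roll(tone, int(itd * fs))   # the left-ear copy arrives later by the ITD

# one candidate internal delay per coincidence detector, from -500 µs to +500 µs
internal_delays = np.linspace(-500e-6, 500e-6, 21)
responses = [np.sum(left * np.roll(right, int(round(d * fs)))) for d in internal_delays]

best = internal_delays[int(np.argmax(responses))]
print(f"most active detector delay: {best * 1e6:.0f} us (true ITD: {itd * 1e6:.0f} us)")
```

the detector whose built-in delay matches the real ITD gets both inputs at the same moment, and that coincidence is what tells the superior olive where the sound came from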
primary auditory pathway
see picture 7b
the auditory network
consists of the ear, ear canal, ossicles, and cochlea
and from the cochlea, auditory nerve fibres coming from the hair cells send information to the cochlear nuclei - they are ipsilateral (receiving information from the same side of the ear or head)
right after it gets processed, it shares that information with both sides of the brain
so from the cochlear nucleus, we have information that gets sent to both left and right superior olivary nuclei
and then they pass information from both ears further along to the inferior colliculus (in the midbrain / tectum)
- it is a nucleus of cell bodies and it receives information from the superior olivary nuclei
- allows us to orient our attention automatically to sounds in the environment
and then it flows to the thalamus (sensory relay of the brain)
- part of the thalamus that receives information is the medial geniculate nucleus
- the lateral geniculate nucleus is the equivalent relay for the visual system
and then information gets passed on to the cortex from the thalamus to the primary receiving auditory region in the temporal lobe (both left and right lobe from both left and right ears)
auditory system is a BILATERAL SYSTEM
SONIC MG
- superior olivary nuclei
- inferior colliculi
- medial geniculate nuclei
note that the visual system is contralateral
here we have both ears getting processed in both the left and right sides of the brain
only bilateral system in our senses
–
since our auditory system is a bilateral system
how does somebody lose hearing in one ear?
- the part of the system that needs to be damaged for us to be DEAF in only ONE ear would be THE COCHLEAR NUCLEI
the cochlear nuclei are the only part of the brain that receives information from just one ear
- they are embedded in the lower brainstem
- you're going to have more damage than just losing hearing in one ear
how about both ears, or sound attenuation?
- could be damage to the ear canal, to the ossicles (everything will sound muffled, as if underwater), or to the hair cells (you will end up with complete hearing loss)
you need bilateral brain damage in order to have complete hearing loss
primary auditory cortex
- how is it organized?
- organized from __ to ___ freq
- what r belt and parabelt regions?
Primary auditory cortex (A1) - Activated by “pure tones” - TONOTOPICALLY organized
- organized from high freq to lower freq
- A1 is processing pitch or frequency information coming from different sources
Signals travel from A1 to the BELT and PARABELT regions
- belt is basically A2 - it will process those sounds in a more complex way (complex sounds)
- parabelt will process even more complex sounds (speech or instrument sounds)
Belt and parabelt regions are “higher level” auditory processing regions
- Respond to more complex sounds – e.g., speech
tonotopic organization (7b)
(A) basilar membrane
(B) tonotopic organization
hierarchy of sound processing
- what responds to the most basic frequencies of sound?
Primary area A1 is responsive only to basic sounds and their modulation; surrounding areas become activated only by intelligible speech.
note:
- A1 responds to the most basic frequencies of sound
- belt regions process complex sounds as well as pure sounds
- parabelt - speech / musical sounds
two streams 7b auditory stream
Auditory processing for ventral “what” and dorsal “where”
Anterior part of the core and belt respond to the sound pattern (WHAT)
Posterior parts respond to the location of the sound (WHERE)
a double dissociation; brain damage
Patient JG (WHAT SYSTEM): damage to the temporal cortex resulted in poor recognition but intact localization
- the what stream goes down through the temporal lobe into the frontal lobe, which is the pathway damaged here
- fine localization, but a severe deficit in identification (recognition) because his what system is damaged
Patient ES (WHERE SYSTEM)
damage to the fronto-parietal cortex resulted in impaired localization but intact recognition
- this affects the stream that goes up toward the parietal lobe and on to the frontal lobe for processing localization
- the damage prevents that pathway from reaching its destination
- she would be able to process what those objects (sounds) are, but her localization is very poor (she doesn't know if a sound is coming from her left, her right, or straight toward her)
–
they are independent in the sense that they don't need each other to function
plasticity in auditory cortex
plasticity - the auditory system can change itself and shows experience-dependent plasticity
- the more experience you have, the more it changes or shapes the way the auditory cortex is organized
Neurons in auditory cortex are tuned based on input and experience
Monkeys were trained to discriminate tones near 2,500 Hz
Following training more of A1 was dedicated to processing tones near 2,500 Hz
notes:
Training in a specific frequency increases the cortical area devoted to that frequency
Trained monkeys to discriminate between two frequencies around 2,500 Hz
After training monkeys were very good at telling the difference between the two tones
Also much more cortical space for neurons to respond to those two frequencies
the somatosensory system (tactile system)
The somatosensory system has both interoceptive and exteroceptive functions
the somatosensory system is the tactile system
- it's not one system / it contains a variety of systems
- it includes hapsis, but there's more to it
main components of the somatosensory system; nociception
Nociception - the perception of pain and temperature
- pain system
main components of the somatosensory system; hapsis
Hapsis - perception of objects using touch and pressure
- the touch system (everything that we touch on our hands and skin goes through hapsis)
main components of the somatosensory system; proprioception
Proprioception - knowledge of the position of your limbs in space
- where our body parts are in all different situations
main components of the somatosensory system; balance
Balance - controlled by the vestibular system in the inner ear
- keeping our balance requires feedback
the skin
The heaviest organ and one of the largest in size
- Warning function
- Keeps fluids and organs intact
- Keeps bacteria and dirt out
- Shields us from the outside elements
The skin is one of the largest organs in our body
- Contains receptors for pain and touch
- Also important for keeping bacteria, viruses, etc., out
the skin; parts
1) Epidermis - is the skin’s outermost layer
- Comprised of dead skin cells
- it sheds / ends up being dust in our house
2) Dermis - below the epidermis
- if you cut it, you would bleed
3) Subcutaneous - below the dermis
- contains more receptors, veins, capillaries and such like that
–
besides allowing us to touch things, the skin is really important for keeping everything on the inside in and keeping everything on the outside (bacteria / viruses) out
somatosensory receptors
- the somatosensory system has 4 different types of receptors
we have these free nerve endings that spread out
it has its own pathway devoted to it