Similarities and differences between senses Flashcards

1
Q

structure of the mammalian ear

A

see notes

2
Q

structure of the mammalian ear research

A

Knudsen (2004)

3
Q

Knudsen (2004)

A

Experience exerts a profound influence on the brain and, therefore, on behavior. When the effect of experience on the brain is particularly strong during a limited period in development, this period is referred to as a sensitive period. Such periods allow experience to instruct neural circuits to process or represent information in a way that is adaptive for the individual. When experience provides information that is essential for normal development and alters performance permanently, such sensitive periods are referred to as critical periods. Although sensitive periods are reflected in behavior, they are actually a property of neural circuits. Mechanisms of plasticity at the circuit level are discussed that have been shown to operate during sensitive periods. A hypothesis is proposed that experience during a sensitive period modifies the architecture of a circuit in fundamental ways, causing certain patterns of connectivity to become highly stable and, therefore, energetically preferred. Plasticity that occurs beyond the end of a sensitive period, which is substantial in many circuits, alters connectivity patterns within the architectural constraints established during the sensitive period. Preferences in a circuit that result from experience during sensitive periods are illustrated graphically as changes in a ‘‘stability landscape,’’ a metaphor that represents the relative contributions of genetic and experiential influences in shaping the information processing capabilities of a neural circuit. By understanding sensitive periods at the circuit level, as well as understanding the relationship between circuit properties and behavior, we gain a deeper insight into the critical role that experience plays in shaping the development of the brain and behavior.

4
Q

tonotopic arrangement of hair cells

A

see notes

• Sensitivity is the same across the basilar membrane - deeper sounds = greater wavelength
5
Q

location on basilar membrane defines which hair cells (auditory receptor cells) respond to different sound frequencies

A

• Cross section of the Organ of Corti (inner ear): ca 20,000 hair cells along basilar membrane
• Inner hair cells - 95% of afferent projections
- Tallest stereocilia in contact with tectorial membrane

see notes

Fettiplace and Hackney (2006)
• Stereocilia displaced, K+ channels stretch open, influx of K+ into hair cell
• Depolarisation (= receptor potential)
- Opening of Ca2+ channels, influx of Ca2+ triggers neurotransmitter release to first-order auditory interneuron

see notes

6
Q

Tympanic ears evolved at least 5 times in the vertebrate line (Schnupp and Carr, 2009; Ladich and Schulz-Mirbach, 2016)

A

see notes

7
Q

Highly schematic representation of the amniote phylogenetic tree over 400 million years to illustrate the approximate time of origin of particular features of auditory systems

A

• Mammals - IHC/OHC - inner/outer hair cells
• Lizards - high/low freq hair cells
• Birds, crocs - THC/SHC - tall/short hair cells
• Parallels THC/SHC and IHC/OHC:
○ THCs and IHCs less specialised and receive strong afferent innervation
○ OHC innervated by few efferent fibres (5%), SHC receive no afferent innervation
• Amniotes arose from earliest tetrapods early in the Palaeozoic and inherited from them simple hearing organ with cochlear amplifier in stereovillar bundles
• Apart from lineages to turtles and Tuatara, that remained primitive, 3 main lineages to modern amniotes distinguished
• Splitting off first were mammalian ancestors, which gave rise to both egg-laying monotremes and marsupial-placental line
• Later, archosaur line originated and led to dominant land organisms of the Mesozoic
• Only crocodile-alligator and bird groups survived to modern times
• The last group to split off was the lizards and snakes within the lepidosaurs
• The tympanic middle ear originated independently in all groups during Triassic, initiating the evolution of unique configs of papillae, with all groups showing papillar elongation and hair-cell specialisations
• Because the hair-cell popns in the monotreme and marsupial-placental mammal groups are so similar, they arose before lineages diverged
• Same applies to birds and Crocodilla
• In lizards there are family-specific variations, suggesting that these hair-cell popns arose soon after Triassic
- Because monotremes don’t have coiled cochlea, coiling developed in marsupial-placental lineage

see notes

8
Q

comparing vision and hearing

A

• Červeny et al. (2011)
- Red foxes hunting small animals show a specific behaviour known as ‘mousing’. The fox jumps high, so that it surprises its prey from above. Hearing seems to be the primary sense for precise prey location in high vegetation or under snow where it cannot be detected with visual cues. A fox preparing for the jump displays a high degree of auditory attention. Foxes on the prowl tend to direct their jumps in a roughly north-eastern compass direction. When foxes are hunting in high vegetation and under snow cover, successful attacks are tightly clustered to the north, while attacks in other directions are largely unsuccessful. The direction of attacks was independent of time of day, season of the year, cloud cover and wind direction. We suggest that this directional preference represents a case of magnetic alignment and enhances the precision of hunting attacks.

9
Q

hearing (Konishi, 1973)

A

• Sound - movement of air particles set in motion by vibrating structure (e.g. string of instrument, membranes/other structures in the body)
• Wave chars of sound - alternate waves of compression and rarefaction of air, molecules move back and forth from regions of high pressure to low pressure - higher freq = shorter wavelengths
• Measures of sound - freq (inversely proportional to wavelength) and amplitude (measured in decibels) - pressure of air on tympanum
• Most birds hear up to 5-6 kHz; the barn owl has exceptional high-freq hearing, with char freqs of 9-10 kHz
- More than half of auditory neurons sensitive in range of 5-10kHz

see notes
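The measures above (frequency vs wavelength, amplitude in decibels) can be sketched numerically. A minimal Python illustration, assuming the standard 343 m/s speed of sound in air and the conventional 20 µPa SPL reference pressure:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed value)

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in metres: higher frequency -> shorter wavelength."""
    return SPEED_OF_SOUND / frequency_hz

def spl_db(pressure_pa: float, reference_pa: float = 20e-6) -> float:
    """Sound pressure level in decibels re the standard 20 µPa reference."""
    return 20.0 * math.log10(pressure_pa / reference_pa)

# A 6 kHz tone (near the barn owl's best range) has a far shorter
# wavelength than a 500 Hz tone:
print(wavelength_m(500.0))   # ~0.686 m
print(wavelength_m(6000.0))  # ~0.057 m
print(spl_db(20e-6))         # 0.0 dB SPL at the reference pressure
```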

Heffner and Heffner (2007)

• Audiograms are measured behaviourally - the animal reports whether it hears a sound or not
• Threshold for a tone = the level at which it is correctly detected > 50% of the time
• SPL - sound pressure level (set at 0 for 1 kHz)
• Curve normalised to a standard value where the 1 kHz value is set to 0 decibels - thresholds can therefore have negative values
• Diff freq limits defined depending on the decibel level at which the threshold cut-off is set - used in comparisons between species, to characterise changes in hearing over age or between indvs, and in many other applications

see notes
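The threshold rule above (lowest level detected correctly on more than 50% of trials) can be sketched in a few lines; the data values below are invented for illustration:

```python
# Hedged sketch of reading a behavioural detection threshold off a
# psychometric function: the lowest tested level (dB SPL) at which the
# animal responds correctly on more than 50% of trials.

def detection_threshold(levels_db, proportion_correct, criterion=0.5):
    """Return the lowest tested level whose proportion correct exceeds
    the criterion; None if the criterion is never reached."""
    for level, p in sorted(zip(levels_db, proportion_correct)):
        if p > criterion:
            return level
    return None

# Hypothetical budgerigar data at 1 kHz:
levels = [-10, -5, 0, 5, 10]           # dB SPL
correct = [0.1, 0.3, 0.45, 0.7, 0.95]  # proportion of correct responses

print(detection_threshold(levels, correct))  # 5
```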

10
Q

psychoacoustics: a subfield of psychophysics (Dent, 2017)

A
  • Audiograms: most common assessment of animal hearing
    • Train using classical and operant conditioning
    • Measurement of detection thresholds: stim varied in freq and intensity played back to animal - if responds in majority of trials correctly, stim above threshold
    • E.g. budgerigar (Melopsittacus undulatus) learns to peck key to start variable waiting interval - trained with rewards and range of loud signal to respond correctly (shaping) - during testing phase, other signal variations interspersed - when can hear signal should peck right key - if not withhold response
    • Set at 0 for humans, and other audiograms adjusted accordingly
    • Can compare whether an animal has more sensitive hearing
    • E.g. whether it can hear noises with a much lower intensity threshold (barn owl or cat compared to humans), or whether humans have more sensitive hearing than, for example, turtles, which hear much less sensitively than humans
    • Also judge width of function and steepness
  • Max, min - make a lot of comparisons between audiograms

see notes

11
Q

Electrophysiology: AEP measurements as non-invasive method for studying hearing functions (Mann et al., 2007)

A

• AEP: auditory evoked potentials used to determine sensitivity thresholds for diff sound freqs - electrode on top of head - used for marine mammals and fish - sound played back at diff intensities; e.g. for 400 Hz, record at which intensity a reliable signal is obtained
• Faster and no need to train animal to auditory stim
• Audiograms generated from AEPs instead of ratios of correct behav responses
• Hearing in 8 Canadian freshwater fish: best in fish with connection between inner ear and swim bladder
• Diff impacts of anthropogenic noise pollution
- Either get very strong or no signal at chosen intensities - variation in diff species of fish

see notes
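The AEP procedure described above can be sketched as a threshold read-out: the lowest playback intensity at which the evoked potential rises reliably above the recording's noise floor. All values here are invented for illustration:

```python
def aep_threshold(intensities_db, amplitudes_uv, noise_floor_uv=0.5):
    """Lowest intensity whose AEP amplitude exceeds the noise floor;
    None if no reliable signal is seen at any tested intensity."""
    detected = [i for i, a in zip(intensities_db, amplitudes_uv)
                if a > noise_floor_uv]
    return min(detected) if detected else None

# Hypothetical 400 Hz series for one fish:
intensities = [80, 90, 100, 110, 120]   # dB re 1 µPa
amplitudes = [0.1, 0.2, 0.8, 1.5, 2.4]  # evoked-potential amplitude, µV

print(aep_threshold(intensities, amplitudes))  # 100
```

A species with a swim-bladder connection to the inner ear would show amplitudes rising above the noise floor at lower intensities, i.e. a lower threshold.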

12
Q

An owl moves its head to face a visual or sound target (Hill et al., 2010)

A

• Movement in space can be represented by angular deviation in 2 directions
○ Horizontal (azimuth) and vertical (elevation) - characterise how the head moves in space
• Align head to face sound source
• Trained to sit on perch
- Coil on head so in electromagnetic field all movements measured

see notes

13
Q

An owl moves its head to face a visual or sound target (Hill et al., 2010) research

A

Knudsen and Konishi (1978)

14
Q

Knudsen and Konishi (1978)

A

Auditory units that responded to sound only when it originated from a limited area of space were found in the lateral and anterior portions of the midbrain auditory nucleus of the owl (Tyto alba). The areas of space to which these units responded (their receptive fields) were largely independent of the nature and intensity of the sound stimulus. The units were arranged systematically within the midbrain auditory nucleus according to the relative locations of their receptive fields, thus creating a physiological map of auditory space

15
Q

how do owls localise sound source (Knudsen et al., 1979)

A

• Search coil on top of owl’s head lies at intersection of horizontal and vertical magnetic fields - movement induces current in search coil
• First viewing direction fixated with sound from zeroing speaker - play the sound back from position
• Head movement towards sound from target speaker measured and accuracy determined
• Head flick delay - 100ms, but sounds of 75ms also elicit flick (open-loop condition)
• Done in dark and soundproof chambers
• Rewarded if locates sound so turns head towards origin of sound
- 1. The dynamics and accuracy of sound localization by the barn owl (Tyto alba) were studied by exploiting the natural head-orienting response of the owl to novel sound stimuli. Head orientation and movement were measured using an adaptation of the search coil technique which provided continuous high-resolution azimuthal and elevational information during the behavior. 2. The owls responded to sound sources with a quick, stereotyped head saccade; the median latency of the response was 100 ms, and its maximum angular velocity was 790°/s. The head saccade terminated at a fixation point which was used to quantify the owl’s sound localization accuracy. 3. When the sound target was located frontally, the owl’s localization error was less than 2° in azimuth and elevation. This accuracy is superior to that of all terrestrial animals tested to date, including man. 4. When the owls were performing open-loop localization (stimulus off before response begins), their localization errors increased as the angular distance to the target increased. 5. Under closed-loop conditions (stimulus on throughout response), the owls again committed their smallest errors when localizing frontal targets, but their errors increased only out to target angles of 30°. At target angles greater than 30°, the owl’s localization errors were independent of target location. 6. The owl possesses a frontal region wherein its auditory system has maximum angular acuity. This region is coincident with the owl’s visual axis.

see notes

• Location accuracy as function of position of target speaker
• Target speaker in front - error less than 2 degrees 
• 0 = position directly in front 
• Y = number of errors across trials 
• Degree by which animal manages to accurately locate speaker 
• With the speaker further to the side, at 70 degrees, the animal locates it less accurately - error in the range of 10 degrees by which it misses accurately pinpointing and facing the speaker - facing direction shifted by 10 degrees
- The closer the speaker is to the frontal hearing field, the fewer degrees of error - around 2 degrees of error in vertical and horizontal directions

see notes

16
Q

precise prey localisation requires both ears

A
• Auditory space in front of owl
	• (L/R - degrees of azimuth, +/- degrees of elevation)
	• Close one of ears - systematic shift
	• Position and angle of accuracy 
- Ears not quite symmetrical

see notes

Knudsen (2002)

• Sound waveform in the right ear delayed and attenuated relative to that in the left - sound reaches the left ear sooner
• Correspondence of interaural timing diff ITD (b) and interaural level diff ILD (c) values with locations in space for 6 kHz sound - sufficient to detect where in space relative to the owl the sound source is located
• Plot interaural timing difference (ITD) required to detect stim depending on where in hearing space it is
• Right in front - sound reaches both ears at same time
• Higher up but close to central line = v small
• Further away, more sideways sound source is in hearing space longer delay becomes 
• Same for ILD - sound travels over a longer distance, so there is some attenuation - a tiny diff in intensity between the sound arriving at one ear and the other - systematically varies across the entire aural field - no diff if the sound source is located directly in front of the animal

A bird sings and you turn to look at it - a process so automatic it seems simple. But is it? Our ability to localize the source of a sound relies on complex neural computations that translate auditory localization cues into representations of space. In barn owls, the visual system is important in teaching the auditory system how to translate cues. This example of instructed plasticity is highly quantifiable and demonstrates mechanisms and principles of learning that may be used widely throughout the central nervous system.

see notes
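How ITD grows with azimuth can be sketched with the simple Woodworth spherical-head approximation, ITD = (r/c)(θ + sin θ). This is a hedged illustration only: the head radius below is a hypothetical, human-scale value, not an owl measurement:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed)

def itd_seconds(azimuth_deg: float, head_radius_m: float = 0.09) -> float:
    """Woodworth approximation of interaural time difference:
    zero for a frontal source, maximal for a source at the side."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(itd_seconds(0.0))   # 0.0 - frontal sound reaches both ears together
print(itd_seconds(90.0))  # maximal delay, on the order of hundreds of µs
```

The monotonic growth of ITD with azimuth is what makes it usable as a location cue; ILD varies systematically in a comparable way.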

17
Q

precise prey localisation requires both ears research

A

Schnupp (2009)

18
Q

Schnupp (2009)

A

Although ears capable of detecting airborne sound have arisen repeatedly and independently in different species, most animals that are capable of hearing have a pair of ears. We review the advantages that arise from having two ears and discuss recent research on the similarities and differences in the binaural processing strategies adopted by birds and mammals. We also ask how these different adaptations for binaural and spatial hearing might inform and inspire the development of techniques for future auditory prosthetic devices.

19
Q

optic tectum is located in the midbrain of the bird brain

A

• Sensory info conveyed through midbrain to thal and further into cerebrum
• Auditory midbrain located on inner side of optic tectum (MLD - mesencephalicus lateralis dorsalis)
• Green = songbird’s cortex, which dominates bird brain anatomy and functions similarly to human cortex - outer brain shell responsible for controlling perception and some aspects of complex behav - used to think songbirds had only a thin and small cortex - believed nearly the entire green region controlled only instinctual behav - bird’s brain thought to be nearly all instinct driven
• Dark and light blue regions = brain stem, which sit toward back of bird’s neck and regulate unconscious behavs - serve as relay stations to cerebral regions (green, brown and orange) - darker blue = midbrain - processing station between thal (light blue), which collects and distributes sensory info, and cerebrum, responsible for higher brain functions such as vocal syntax - midbrain also transmits info between thal and spinal cord
• Yellow = cerebellum, which regulates fine movement controls
- Orange and brown (brain cut in half lengthways) = basal ganglia, which (with cortex) control learning and sequencing of movements - previously believed that primitive operations of small region of brain extended throughout green area - Jarvis believes that bird’s basal ganglia also involved in memory and general learning, and suggests that at some point soon functions added to widely accepted view of function

see notes

• Ear with hair cells located along the basilar membrane; depending on which sound impinges on the tympanum it is amplified and leads to vibrations in the inner ear
• In the cochlea the basilar membrane is deflected depending on whether the sound is low or high pitched - deflections happen at the apex or base of the basilar membrane
• Hair cells don’t have axons - they connect directly to first-order interneurons, which have long axons that reach the cochlear ganglion and project onwards into the cochlear nucleus
• Number of connections shown - into the superior olive, into the lateral lemniscus and from there into the MLD in the midbrain  - From the MLD auditory info passed on to other brain areas inc. cerebrum
20
Q

Measuring the interaural time difference (ITD) in the cochlear nucleus

A

• Jeffress model: sound location computed from diffs in delay and intensity between 2 ears (Jeffress, 1948)
• Carr and Konishi (1988) confirmed with studies of barn owl basic premises of model
○ Interaural time difference is an important cue for sound localization. In the barn owl (Tyto alba) neuronal sensitivity to this disparity originates in the brainstem nucleus laminaris. Afferents from the ipsilateral and contralateral magnocellular cochlear nuclei enter the nucleus laminaris through its dorsal and ventral surfaces, respectively, and interdigitate in the nucleus. Intracellular recordings from these afferents show orderly changes in conduction delay with depth in the nucleus. These changes are comparable to the range of interaural time differences available to the owl. Thus, these afferent axons act as delay lines and provide anatomical and physiological bases for a neuronal map of interaural time differences in the nucleus laminaris.
• Cochlear nucleus contains an imp structure - a coincidence detection mechanism
• A neuron listens to the signals coming from both ears - the signal takes time to travel - signals are fed into branches - if they arrive at the same time, the location of the sound source can be coded
• Located in the hindbrain
• ITD coded in the cochlear nucleus using the coincidence mechanism - each neuron projects its signal in a way that is segregated and can be traced back to a corresponding ITD
- Info mapped onto structures in the MLD in the tectum

see notes
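The Jeffress delay-line idea above can be sketched as a toy model (all sizes and delays are invented, in arbitrary "steps"): each coincidence detector receives the left input after i delay steps and the right input after N-1-i steps, so the detector whose delay-line geometry cancels the interaural time difference responds best, and its position encodes the ITD:

```python
def best_detector(itd_steps: int, n_detectors: int = 9) -> int:
    """Index of the coincidence detector whose delay-line geometry best
    compensates an ITD of `itd_steps` (sign convention arbitrary here)."""
    best, best_mismatch = None, None
    for i in range(n_detectors):
        left_delay = i                      # steps along the left delay line
        right_delay = n_detectors - 1 - i   # steps along the right delay line
        # Spikes coincide when the delay-line difference cancels the ITD:
        mismatch = abs((left_delay - right_delay) + itd_steps)
        if best_mismatch is None or mismatch < best_mismatch:
            best, best_mismatch = i, mismatch
    return best

print(best_detector(0))  # 4 - the middle detector codes a frontal source
print(best_detector(4))  # 2 - a lateral source shifts the active detector
```

The mapping from ITD to the position of the most active detector is what produces the orderly, traceable code described above.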

21
Q

Measuring the interaural time difference (ITD) in the cochlear nucleus research

A

Smith and Price (2014)

22
Q

Smith and Price (2014)

A

Sound source localization is critical to animal survival and for identification of auditory objects. We investigated the acuity with which humans localize low frequency, pure tone sounds using timing differences between the ears. These small differences in time, known as interaural time differences or ITDs, are identified in a manner that allows localization acuity of around 1° at the midline. Acuity, a relative measure of localization ability, displays a non-linear variation as sound sources are positioned more laterally. All species studied localize sounds best at the midline and progressively worse as the sound is located out towards the side. To understand why sound localization displays this variation with azimuthal angle, we took a first-principles, systemic, analytical approach to model localization acuity. We calculated how ITDs vary with sound frequency, head size and sound source location for humans. This allowed us to model ITD variation for previously published experimental acuity data and determine the distribution of just-noticeable differences in ITD. Our results suggest that the best-fit model is one whereby just-noticeable differences in ITDs are identified with uniform or close to uniform sensitivity across the physiological range. We discuss how our results have several implications for neural ITD processing in different species as well as development of the auditory system.
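The paper's core idea - a uniform just-noticeable difference in ITD predicting acuity that worsens laterally - can be sketched numerically. This is a hedged illustration using the Woodworth head approximation and invented parameters (human-scale head radius, a 20 µs JND), not the authors' exact model: because d(ITD)/d(azimuth) shrinks toward the side, a fixed ITD step corresponds to a larger angular step there.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed)
HEAD_RADIUS = 0.09      # m, hypothetical human-scale value

def itd(azimuth_deg: float) -> float:
    """Woodworth ITD approximation, in seconds."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def angular_acuity(azimuth_deg: float, itd_jnd_s: float = 20e-6) -> float:
    """Angle change (degrees) needed to produce one ITD JND here -
    smaller means finer localization."""
    slope = itd(azimuth_deg + 1.0) - itd(azimuth_deg)  # s per degree
    return itd_jnd_s / slope

# Acuity is finest at the midline and degrades toward the side:
print(angular_acuity(0.0) < angular_acuity(60.0))  # True
```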

23
Q

Locations of sound sources are mapped in 2 dimensions onto the MLD

A

• Auditory space in front of owl
• (L/R - degrees of azimuth, +/- - degrees of elevation)
• Inner part of auditory region
• Tonotopic mapping of interneurons (according to freq tuning)
• Outer part - interneurons tuned to 6-8 kHz, but sensitive to spatial location of sound
- Move electrode along neural structures, records tonotopically mapped responses - correspond to particular positions in hearing field of owl

see notes

24
Q

Locations of sound sources are mapped in 2 dimensions onto the MLD research

A

Heffner and Heffner (2016)

Knudsen and Konishi (1978)

25
Q

Heffner and Heffner (2016)

A

The ability to locate the source of a sound too brief to be either scanned or tracked using head or pinna movements is of obvious advantage to an animal. Since most brief sounds are made by other animals, the ability to localize such sounds enables an animal to approach or avoid other animals in its immediate environment. Moreover, it can be used to direct the eyes, thus bringing another sense to bear upon the source of the sound. Given the value of sound localization to the survival of an animal, it is not surprising that the need to localize sound has been implicated as a primary source of selective pressure in the evolution of mammalian hearing (Masterton et al. 1969; Masterton 1974).

26
Q

Knudsen and Konishi (1978)

A

○ 1. The influence of sound location and sound frequency on the responses of single units in the midbrain auditory area (MLD) of the owl (Tyto alba) were studied using a movable sound source under free-field conditions. With this technique, two functionally distinct regions in MLD have been identified: a tonotopic region and a space-mapped region.
○ 2. MLD units were classified according to their receptive-field properties: 1) limited-field units responded only to sound from a small, discrete area of space; 2) complex-field units exhibited two to four different excitatory areas separated by areas of reduced response or inhibition: 3) space-preferring units responded best to a certain area of space, but their fields expanded considerably with increasing sound intensities; 4) Space-independent units responded similarly to a sound stimulus regardless of its location in space.
○ 3. Limited-field units were located exclusively along the lateral and anterior borders of MLD. These units were tuned to sound frequencies at the high end of the owl’s audible range (5-8.7 kHz). They usually responded only at the onset of a tonal stimulus; but most importantly, the units were systematically arranged in this region according to the azimuths and elevations of their receptive fields, thus creating a physiological map of auditory space. Because of this latter, dominant aspect of its functional organization, this region is named the space-mapped region of MLD.
○ 4. The receptive fields of units in the larger, medial portion of MLD were of the space-independent, space-preferring, or complex-field types. These units tended to respond in a sustained fashion to tone and noise bursts, and these units were arranged in a strict frequency-dependent order. Based on this last property, this region is named the tonotopic region of MLD.
- 5. Because of the salient differences in the response properties of their constituent units, it is argued that the space-mapped region and the tonotopic region are involved in different aspects of sound analysis.

27
Q

Parallel processing of time (interaural time difference ITD) and intensity (interaural level difference ILD)

A

see notes

• Hearing info processed in 2 parallel pathways, one through the magnocellular nucleus and laminar nucleus
• Another that receives input from the first-order interneurons of the ear and passes through the angular nucleus - codes for intensity diffs - segregation of ITD and ILD at an early stage
• Mapped into ITD and interaural intensity diffs that later converge in the MLD in the midbrain - allows reconstruction of the location of the sound source as a spatial map in auditory space
28
Q

Parallel processing of time (interaural time difference ITD) and intensity (interaural level difference ILD) research

A

Manley et al. (1988)

29
Q

Manley et al. (1988)

A

The nucleus ventralis lemnisci lateralis pars posterior (VLVp) is the first binaural station in the intensity-processing pathway of the barn owl. Contralateral stimulation excites and ipsilateral stimulation inhibits VLVp cells. The strength of the inhibition declines systematically from dorsal to ventral within the nucleus. Cells selective for different intensity disparities occur in an orderly sequence from dorsal to ventral within each isofrequency lamina. Cells at intermediate depths in the nucleus are selective for a particular narrow range of interaural intensity differences independently of the absolute sound-pressure level. A simple model of the interaction between inhibition and excitation can explain most of the response properties of VLVp neurons. The map of selectivity for intensity disparity is mainly based on the gradient of inhibition.

30
Q

spatial mapping is projected to cortical areas (Grothe et al., 2010; Yao et al., 2015)

A

see notes

• Segregate sound info into ITD and ILD
• Pathway similar - have inner ear with superior olivary nucleus, the cochlear nuclei, inferior colliculus in midbrain
• MGN (medial geniculate nucleus) in thal and auditory cortex - projections from both sides interact at the level of the brainstem to produce comparisons of signals that originate from both ears

Jarvis (2009)

• Evolution of Pallium in birds and reptiles
• Example sensory (auditory) and motor (vocal) pathways in songbirds, in comparison with other vertebrates
• (a) auditory pathway in songbird showing ascending and descending input
• (b) similar auditory pathways, but sometimes with diff nomenclature used for indv nuclei, can be found for all amniotes examined
• Only sub-pathway through cochlea and lateral lemniscal nuclei shown
• once in telencephalon, parallels can be found in cell type connectivity, although pallial organisations diff and projections in amphibians mostly to striatum 
• Layers and serial organisation/hierarchical organisation very similar 
• Ear with hair cells, cochlear ganglion as first processing layer, cochlear nuclei, the lemniscal nuclei, and corresponding structures - the MLD in the tectum of birds and the inferior colliculus - with projections to the equivalent of the mammalian cortex, which is the cerebrum in birds - distantly related groups of vertebrates showing lots of similarities

see notes

• Light and sound propagate as waves that differ in freq and intensity
• Whilst light is absorbed by photoreceptor as quanta, sound vibrates internal structures of ear
• Spatial r'ships of stim in outer world coded through retinotopic mapping in visual pathways - in hearing pathways spatial r'ships largely lost - to use sound for accurate location of sound source, these need to be reconstructed in brain
• Auditory pathways parallel and serial connections, similar to vision - tonotopic maps result from arrangements of sensory interneurons in cochlea and imp binaural comparisons and reconstruction of spatial locations relative to body
• Research in birds contributed fundamental demo of neural mechanisms relevant to human hearing - Audiograms allow comparisons between species to determine how hearing can be adapted to diff tasks and ecological needs
31
Q

Simmons and Young (2010)

A

• Many species of owls and bats rely on hearing to locate prey, and their auditory systems are specialised to enable them to do that
• Owls locate sounds by comparing signals between the 2 ears: intensity for elevation and timing for azimuth
• Space-specific neurons in the external auditory nucleus of midbrain respond to sounds from particular locations - arranged in orderly way, forming map of space
• Sensory neurons from ear encode info about sound freq, time and intensity - brain uses info to compute receptive field for each space-specific neuron
• Info about relative intensity and timing of sounds at 2 ears processed in separate pathway on either side of brain
• Sound intensity compared in angular and then posterior lemniscal nuclei
• Sound time compared in laminar nucleus, in which neurons act as coincidence detectors receiving inputs from axons of left and right magnocellular nuclei, which act as delay lines
• Auditory map of space calibrated by reference to visual map in optic tectum
• Most insectivorous bats use echolocation, monitoring echoes of own cries to navigate and find insect prey
• In FM cry, sound freq alters to give broadband signal, suitable for target ranging and description - in CF cry, sound freq constant, useful for detecting relative velocity and for hunting in woodland
• As bat detects and then intercepts prey, rate of echolocation pulses increases dramatically
• Bat’s auditory system specialised to detect faint echoes that follow loud cries - most sensitive to freqs near that of cry - highly directional - shows reduced sensitivity to loud cry compared with following soft echo, both in ear and brain - bat auditory neurons quick to recover responsiveness following each sound
• Some neurons in inferior colliculus act as accurate time markers, signalling exact time of cry and then of echo
• In cerebral cortex, time-marking neurons provide inputs to neurons sensitive to particular cry-echo delays - in some bats, neurons arranged in organised map in which neuron’s location related to distance between bat and sound-reflecting target
• Bats that use CF cries have acoustic fovea in ears and large number of brain neurons dedicated to analyse echoes of CF cry
- In auditory cortex of CF bat are neurons that detect flying insects by responding to small modulations in freq of echoes from CF cries

32
Q

Hill et al. (2016)

A

• Neural circuits of the vertebrate retina integrate the response of retinal photoreceptors to excite and inhibit retinal ganglion cells - ganglion cell receptive fields may be excited or inhibited by light at the centre of the field, whereas light in the surround antagonises the effect of light in the centre
• Straight-through pathways (photoreceptor -> bipolar cell -> ganglion cell) produce the centre (on- or off-centre) of a ganglion cell’s receptive field - lateral pathways through horizontal cells and amacrine cells produce the antagonistic surround
• Axons of ganglion cells make up the optic nerve, relaying visual information to several brain areas - the geniculostriate pathway projects to the lateral geniculate nucleus (LGN) and from there to the primary visual cortex
• Simple and complex cells in V1 respond to light or dark bars or edges oriented at particular angles
• Parallel pathways in the visual cortex convey info about diff aspects of a visual stim, such as details of visual form, movement, colour and binocular determination of object distance
- Colour vision depends on the ratio of activation of three classes of cone photoreceptors sensitive to diff wavelengths of light - retinal circuitry integrates colour contrasts based on red-green and blue-yellow opponencies
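The centre-surround organisation described above is classically modelled as a difference of Gaussians. A minimal sketch (Python; the sigma values and surround weight are illustrative assumptions, not retinal measurements):

```python
# Minimal sketch of an on-centre receptive field as a difference of
# Gaussians; sigmas and surround weight are illustrative, not data.
import math

def gaussian(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma))

def on_centre_response(x, sigma_centre=1.0, sigma_surround=3.0):
    """Narrow excitatory centre (straight-through pathway) minus a
    broader inhibitory surround (lateral pathways via horizontal and
    amacrine cells)."""
    return gaussian(x, sigma_centre) - 0.5 * gaussian(x, sigma_surround)

# Light at the centre excites the modelled ganglion cell...
assert on_centre_response(0.0) > 0
# ...while light falling only in the surround antagonises it.
assert on_centre_response(4.0) < 0
```

An off-centre field is simply the sign-flipped version of the same profile.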

33
Q

Zupanc (2019)

A

• The info flow arising from the env of an animal is dramatically reduced as a result of sensory and central filters called releasing mechanisms - the Umwelt of an animal reflects only a tiny portion of the absolute physical env
• Releasing mechanisms act as sensory/central links between a stim originating from the env and the resultant behav - the component of the env that triggers a given behav = sign stim/if occurs in context of social communication, a releaser
• Ethologically relevant features of sign stim can be examined through use of dummies/models
• Models providing supernormal stim elicit a greater response from animal than natural object - exploited in human societies to make certain features used for communication more attractive
• R’ship between indv components of sign stim when evoking behav response described by 2 principles: 1) law of heterogeneous summation states that 2+ separately effective stim properties are additive in their partial effects when combined with one another; 2) Gestalt principle applies to situations in which combined stim regime more effective than sum of parts
• Effectiveness of sign stim depends not only on features of stim, but also condition of recipient
• Besides having immediate releasing effect, sign stim may also produce LT changes in receiver’s motiv to generate corresponding behav patterns
• First model system based on ability of toads to recognise prey and predators and to respond with appropriate behav patterns - in dummy exps, such objects can be identified by using rectangular stripes - movement of rectangles in direction of long axis elicits responses resembling those observed under natural conditions toward prey - movements in direction of short axis results in responses exhibited toward predators
• Within certain limits, increase in length of long axis leads to greater response in the toad
• Major targets in toad’s brain of retinal ganglion cells are thalamic-pretectal area and optic tectum - electrical recordings from brain regions, as well as ganglion cells, revealed poor selectivity of retinal ganglion cells between worm-like and anti-worm-like stim configurations - in both thalamic-pretectal area and tectum, there are popns of neurons that are activated by more complex stim configurations - one cell type called TP3 in thalamic-pretectal area, best activated by anti-worm-like stim - cell type termed T5(2) in tectum activated by worm-like stim
• Prey-catching behav released by electrical stim of optic tectum, whereas stim of thalamic-pretectal region activates escape behav
• Configurational selectivity of T5(2) neurons in tectum depends on inhib input received from thalamic-pretectal area - lesioning of connection results in toad being unable to discrim between prey and predator objects
• Second model system centres around barn owls - use sense of hearing to localise prey - upon hearing noise, owl turns head in rapid flick so it faces source of sound
• Owl’s head-turning response can be monitored with electromagnetic angle-detector system consisting of 1) search coils mounted on top of head and 2) induction coils between which bird positioned
• Flick of head initiated approx. 100ms after onset of sound within 1-2 degrees in azimuth and elevation
• Owl’s ability to localise source of sound based on analysis of interaural time diffs and interaural intensity diffs - ITD occur when sound source not directly in front of owl - define location of sound in azimuth - IID result of directional asymmetry of owl’s ears and facial ruff - determine elevation coordinate of sound source
• Sound intensity and timing encoded in each fibre of auditory nerve by variation of rate of firing and locking of APs to particular phase angle of spectral component to which respective fibre tuned
• Each fibre of auditory nerve divides into 2 collaterals - one innervates magnocellular nucleus, while other enters angular nucleus - both nuclei are subdivisions of cochlear nucleus - neurons of angular nucleus sensitive to intensity info, neurons of magnocellular nucleus process timing data
• Next processing station is laminar nucleus - receives input from both ipsilateral and contralateral magnocellular nuclei - main function is to compute and encode interaural time diffs - achieved by 1) axons of magnocellular nucleus serving as delay lines and 2) neurons of laminar nucleus functioning as coincidence detectors - timing info extracted is ambiguous due to phase ambiguity at this level of sensory processing
• IID computed in posterior lateral lemniscal nucleus, which receives input both from contralateral counterpart and from contralateral angular nucleus
• Timing and intensity info converge in lateral shell of central nucleus where neurons respond to both ITD and IID
• While phase ambiguity still persists in lateral shell, it’s overcome at next level of sensory processing, namely in external nucleus - in brain region, space-specific neurons respond only to acoustic stim originating from receptive field - neurons form neural map of auditory space - within map, regions near midpoint of face represented by sig more neurons than more lateral areas - leads to high res if sound comes from sources directly in front of owl
• Final step in sensory processing is formation of auditory-visual map achieved through projection of neurons in external nucleus to optic tectum - alignment of 2 sensory maps controlled during ontogeny by visual instruction of auditory spatial tuning of neurons in optic tectum
- Auditory-visual map projects onto motor map, stim of which induces head movements
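The laminar-nucleus ITD computation above (delay lines plus coincidence detectors, the Jeffress-style arrangement) can be sketched as follows. All parameters are illustrative, not owl measurements, and the toy model ignores the phase ambiguity noted above:

```python
# Sketch of a Jeffress-style delay-line model of interaural time
# difference (ITD) coding; parameters are illustrative, not owl data.

def coincidence_array(itd_us, delays_us, window_us=10.0):
    """Left input passes through an axonal delay line of length d,
    right input through max_d - d; a coincidence detector 'fires' (1)
    when the two arrivals meet within the tolerance window.
    Here itd_us > 0 means sound reaches the left ear itd_us after
    the right ear."""
    max_d = max(delays_us)
    responses = []
    for d in delays_us:
        left_arrival = itd_us + d
        right_arrival = max_d - d
        responses.append(1 if abs(left_arrival - right_arrival) <= window_us else 0)
    return responses

# Only the detector whose delay lines cancel the external ITD fires,
# turning a time difference into a place code along the array.
delays = [0.0, 25.0, 50.0, 75.0, 100.0]
responses = coincidence_array(50.0, delays)  # -> [0, 1, 0, 0, 0]
```

Which detector fires thus encodes azimuth as a position in the array, consistent with the space map described for the external nucleus.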