Module 5 Flashcards
Spatial neglect
-it’s an attention deficit: the person cannot attend to info in the space contralateral to the brain damage
-comes from damage to the parietal lobe
-often following right-hemisphere damage (the right hemisphere is specialized for spatial processing)
Ex. Ask a patient to copy a star: they will completely ignore the left side
-not just vision; it occurs with other sensory modalities and with internal representations: reading only the words on the right side of a page, eating only one side of the plate, describing only half of imagined scenes and memories
What do the networks in prefrontal cortex and parietal cortical regions do for attention
They orient processing toward the part of the brain that is needed, direct processing for the attended-to task, and control which portions of the brain are paying attention
What are the intraparietal sulcus (IPS) and frontal eye field (FEF) for
Top-down (endogenous) attention: voluntarily preparing to focus on something, for example focusing on a lecture
What are the temporo-parietal junction (TPJ) and ventral frontal cortex (VFC) for
Bottom-up attentional orienting (exogenous attention): when something grabs your attention, like a dog barking
Change blindness
The failure to detect changes in a stimulus, even in a zone you are attending to
Broadbent’s early selection filter model
We filter at the level of perception, before information is processed for meaning. We block almost all surrounding info out. Only info that is physically different from the attended-to info (e.g., louder) gets noticed.
So information is held in the sensory buffer for a very short amount of time, and then we select what will be further processed for meaning.
Then studies showed that listeners switch to the unattended channel if they hear their own name there. This motivated the late-selection filter models: the filter comes after meaning is processed.
Problems with early selection filter models
It looks like we can respond even to the unattended ear.
Ex. Participants were presented with a word paired with an electric shock. When they later heard the shocked word in the unattended ear, skin conductance increased.
Ex 2. At a party you can attend to one conversation but still hear your name in an unattended conversation.
This means unattended information must be processed to some degree, at least up to meaning.
Attenuator model of attention (Treisman)
A compromise between the early and late selection models. Some aspects of unattended material are processed for meaning. The filter is not all-or-none.
Think of a filter that lets some information in and attenuates the rest: unattended stimuli are processed, but at a reduced level relative to attended stimuli.
High-priority words (like your name) and expected items can still break through even when attenuated.
Late selection filter model
We process input to the level of meaning and then select what we want to process further based on relevance to the task. Does it fit semantically?
Ex. In the Stroop task (naming the ink colour of a word that spells a different colour name), the written word is supposed to be ignored, yet it is still read and processed for meaning, so you are tempted to say the word; redirecting attention to the ink colour to give the right answer takes more time.
Stroop effect
The delay in reaction time between congruent and incongruent stimuli.
Ex. A mismatch between the name of a colour and the ink it is printed in. When asked to name the colour of the ink, responses are slower and more error-prone when the ink colour does not match the colour name (see the sketch below).
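A minimal sketch, not from the course, of how congruent and incongruent Stroop trials could be built; the colour list and the function name make_trial are assumptions:

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_trial(congruent: bool) -> dict:
    """Build one Stroop trial: a colour word displayed in some ink colour."""
    word = random.choice(COLORS)
    if congruent:
        ink = word  # word meaning and ink colour match
    else:
        ink = random.choice([c for c in COLORS if c != word])  # mismatch
    # The task is to name the ink, so the correct response is the ink colour.
    return {"word": word, "ink": ink, "correct_response": ink}

# Prediction: naming the ink is slower and more error-prone on the
# incongruent trials, because the automatically read word conflicts
# with the task-relevant ink colour.
trials = [make_trial(congruent=(i % 2 == 0)) for i in range(20)]
```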
The load theory
Attentional filtering can occur at different points depending on how many resources your currently attended-to task requires
Low load: unattended info is processed to a later stage (meaning)
High load: unattended info is processed only to an early stage (perception)
Ex:
-difficult task with a high load: we filter at the level of perception; we don't have the capacity to process the meaning of every piece of information that reaches us, so we filter early and keep our attention on the task
-easy task with a low load: we process all info to the level of meaning; we have spare capacity, so we can process even irrelevant info for meaning
Ex. If you are doing a high-load task that needs a lot of concentration, you have fewer resources left to analyse what is going on outside of what you are doing, so there is less chance of seeing the elephant than in a low-load condition (see the sketch below)
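A toy restatement of the load idea in code, purely illustrative and not part of the course material; the function name and the "high"/"low" labels are assumptions:

```python
def distractor_processing(task_load: str) -> str:
    """Where unattended info gets filtered, depending on current task load."""
    if task_load == "high":
        # Little spare capacity: irrelevant input is filtered early,
        # at the level of perception, before its meaning is analysed.
        return "early selection: perception only"
    # Spare capacity spills over: irrelevant input is processed
    # all the way to the level of meaning.
    return "late selection: processed for meaning"

print(distractor_processing("high"))  # early selection: perception only
print(distractor_processing("low"))   # late selection: processed for meaning
```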
Multiple resource capacity
Attentional capacity is reached sooner if the relevant and irrelevant info come from the same modality.
Ex. Driving while needing directions: you will have more trouble paying attention if you are viewing the directions on your phone than if you are listening to them, because viewing and driving use the same (visual) resource.
Inattentional blindness
Not noticing something new that is added to the scene while your attention is elsewhere, like the blue star in the lecture slide, or a deer jumping in front of the car.
Inhibition of return
Attention is inhibited from returning to a recently attended location when there is a long delay between the spatial cue and the target.
Short delay between cue and target: fast response when the target really appears at the cued location, slower response when it appears on the other side.
Long delay between cue and target: the brain concludes the target is not there and shifts attention elsewhere, so responses are really fast if the target appears on the opposite side and slower if the target actually appears on the cued side (see the sketch below).
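A rough sketch of the cueing pattern described above; the ~300 ms cutoff and the function name are assumptions chosen only for illustration (real experiments vary the cue-target delay continuously):

```python
def predicted_rt(cue_side: str, target_side: str, delay_ms: int) -> str:
    """Toy prediction for one trial of a spatial-cueing task.

    Short cue-target delay: attention is still at the cued side, so cued
    targets are detected fast and opposite-side targets slowly.
    Long delay: attention has moved on and is inhibited from returning,
    so cued targets are now the slower ones (inhibition of return).
    """
    cued = cue_side == target_side
    if delay_ms < 300:  # rough cutoff, for illustration only
        return "fast" if cued else "slow"
    return "slow" if cued else "fast"

print(predicted_rt("left", "left", delay_ms=100))   # fast
print(predicted_rt("left", "left", delay_ms=800))   # slow (inhibition of return)
print(predicted_rt("left", "right", delay_ms=800))  # fast
```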
Feature search
Search for an object that differs from the distractors on a single feature. Bottom-up, automatic processing.
Ex. A red dot among a ton of green dots (see the sketch below).
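A small sketch of what a feature-search display could look like; the item format and function name are assumptions:

```python
import random

def make_display(n_distractors: int) -> list:
    """One feature-search display: a single red item among green distractors."""
    items = [("circle", "green")] * n_distractors
    items.append(("circle", "red"))  # target differs on a single feature: colour
    random.shuffle(items)
    return items

# Pop-out prediction: because the target differs on one feature only, it is
# found by bottom-up, parallel processing, so search time stays roughly flat
# as the number of distractors grows.
for n in (5, 20, 80):
    display = make_display(n)
    print(len(display), "items, target present:", ("circle", "red") in display)
```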