Cognition in clinical context Flashcards
cognition
“refers to all processes by which the sensory input is transformed, reduced, elaborated, stored, recovered and used” Neisser 1967
used in everything humans do
information processing metaphor
looking at the brain doesn't reveal its functions, so a metaphor is needed
processes between presentation of stimulus and response
information -> input -> processor -> storage -> processor -> output -> information
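a minimal Python sketch of the information-processing metaphor (the stage functions and the example stimulus are invented for illustration, not part of the lecture material):

memory = []                          # storage

def encode(stimulus):                # input stage: register the stimulus
    return stimulus.lower().split()

def process(representation):         # processor: transform / reduce / elaborate
    return [word for word in representation if word != "the"]

def respond(representation):         # output stage: produce a response
    return f"{len(representation)} content words perceived"

rep = process(encode("THE CAT SAT ON THE MAT"))
memory.append(rep)                   # stored, later recovered and used
print(respond(memory[-1]))           # -> "4 content words perceived"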
why is a scientific metaphor needed for cognition
scientific theories aim to be refutable
can’t look inside mind to see cognition
cognitive psychology relies on analogy
models of cognition evaluated against data
methodological approaches
- experimental psychology: healthy humans tested, large sample sizes, small budget
- computational modelling: computer simulations of cognitive processes, parallel distributed processing
- cognitive neuropsychology: consequences of brain damage, associations and dissociations
- cognitive neuroscience: brain implements cognitive functions using neuroimaging technology
top down processing
processing influenced by individual goals, expectations, desires, plans, intentions rather than stimulus by itself
bottom up processing
processing directly influenced by stimulus, input proceeds through series of processing stages until required output is produced
perception
refers to ability to extract meaning from sensory input
includes audition, taste, touch, olfaction but is dominated by vision
vision alone accounts for over 50% of neurons in cortex
perception is constructive process
visual system
image -> retina -> cognitive system constructs perception
distal stimulus-object “out there” in environment
proximal stimulus-info registered on sensory receptors
misleading impression of simplicity
Tootell 1982 - supports the notion that a near-perfect representation of the external world is projected onto primary visual cortex
monkeys anaesthetised, eyes held open, stimulus displayed to the eyes for 25 mins, radioactive glucose used to trace brain activity; the pattern of activity in visual cortex formed an image of the stimulus (a distorted retinotopic map rather than an exact copy)
processing streams
ventral-‘what’ pathway, facts/objects you see
dorsal-‘where’ pathway, sensory inputs/location
-ventral stream; visual memory, specialised for object perception and recognition, determines what you’re looking at, V1 to temporal lobe
-dorsal stream; no visual memory, determines where object is using spatial configurations between objects, V1 to posterior regions of parietal lobe
- no colour in peripheral vision; the brain fills it in
object recognition; 3 stage model
- image->local features ->shape representation ->object recognition
- image -> edge detection/contrast -> gestalt principles/feature integration -> stored representation/knowledge
- if object recognition goes wrong, the model helps identify at which stage it failed (see the sketch below)
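a rough Python sketch of the three-stage model above; the toy "image", features and stored representations are invented placeholders:

STORED_KNOWLEDGE = {                       # stage 3: stored representations
    frozenset({"round", "stem"}): "apple",
    frozenset({"four legs", "flat top"}): "table",
}

def extract_local_features(image):         # stage 1: edge detection / contrast
    return [part for part in image if part != "background"]

def build_shape_representation(features):  # stage 2: Gestalt grouping / feature integration
    return frozenset(features)

def recognise(shape):                      # stage 3: match against stored knowledge
    return STORED_KNOWLEDGE.get(shape, "unrecognised (an agnosia-like failure)")

image = ["background", "round", "stem", "background"]
shape = build_shape_representation(extract_local_features(image))
print(recognise(shape))                    # -> apple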
gestalt principles
- whole visual perception more than sum of parts
- perceptual system tries to impose organisation on inputs
- components of image grouped together on basis of certain visual properties
- laws ‘good continuation’ and ‘closure’ = illusory contours
- see patterns rather than random arrangement
- proximity/similarity/continuation/closure
shape perception
- primarily bottom up processes produce primitive sketch
- sketch contains primitives eg edges/orientations/positions
- top down processes used to group collections of primitives into "lines, curves, blobs, groups and small patches", known as symbolic primitives
object recognition 3 models
- template matching
- feature analysis
- recognition by components
how do we do facial recognition so well
- challenge of individuating faces made apparent by fact they share basic configuration
- first order relations: two eyes, nose, mouth
- features are ample for producing the percept of a face, but inadequate for producing the percept of 'that' particular face
- some instances where features are distinctive and accurate in signalling identity of individual
- second order relational properties
early model of facial recognition-modular model
Bruce and Young 1986
modular model with sub-functions which are processed independently
distinctive pathways: recognising familiar faces vs recognising expressions
parallel pathways: facial expression, facial speech, visually derived semantic information
familiar face recognition - FRUs & PINs
serial process
- face recognition units-stored descriptions of previously encountered faces, seen it before?
- personal identity nodes-identity specific info, their job, where they are from
- name generation-input from PINs generate name of identified individual
other routes for face recognition
parallel
expression analysis-computing facial emotion as happy, fearful etc
- facial speech analysis - lip reading, with or without hearing loss
-directed visual processing-when looking for information from a face
early evidence for bruce and young model-memory loss diary
memory loss diary study
-person not recognised
-feeling familiarity without identity
-person recognised, name not retrieved
-person misidentified
-repetition priming-recognise it quicker next time
familiarity does not influence gender decisions or expression analysis
humans selectively attend to identity or emotion in sorting tasks
brain support for Bruce and Young model
parallelism: double dissociation between processing of facial expression and face recognition; some have deficits in identity but not expression and vice versa
double dissociation: two related processes function independently
different cortical sites active in processing of identity vs emotion, lateral fusiform gyri and inferior occipital
challenge of semantic priming
semantic-meaning/content/facts
a face is responded to faster if it follows a closely related face than an unrelated face
response to a target is faster when preceded by a semantically related prime
interactive activation and competition (IAC model)
- built from basic pools of units (e.g. FRUs, PINs, semantic information)
-semantic information/knowledge is ‘pooled’
relationships between different bits of knowledge are represented in connections between the pools
- connections within pools are mutually inhibitory
- connections between pools are mutually facilitatory
- if info from a pool is recollected, it inhibits other info from that pool being remembered, but linked info from other pools is also activated (sketched below)
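a toy Python sketch of the IAC principle, assuming three pools (FRUs, PINs, semantic units) and two invented people who share an occupation; an illustration of pooled inhibition/facilitation, not the published implementation:

units = {"FRU_personA": 0.0, "FRU_personB": 0.0,
         "PIN_personA": 0.0, "PIN_personB": 0.0,
         "SIU_sharedJob": 0.0}                     # unit -> activation (0..1)

pools = {"FRU": ["FRU_personA", "FRU_personB"],    # within a pool: inhibition
         "PIN": ["PIN_personA", "PIN_personB"],
         "SIU": ["SIU_sharedJob"]}

links = [("FRU_personA", "PIN_personA"), ("FRU_personB", "PIN_personB"),
         ("PIN_personA", "SIU_sharedJob"), ("PIN_personB", "SIU_sharedJob")]

EXCITE, INHIBIT, DECAY = 0.2, 0.1, 0.05            # invented parameters

def step(external_input):
    new = {}
    for u, act in units.items():
        net = external_input.get(u, 0.0)
        for a, b in links:                          # between pools: facilitation
            if u == a: net += EXCITE * units[b]
            if u == b: net += EXCITE * units[a]
        for rival in pools[u.split("_")[0]]:        # within pool: inhibition
            if rival != u:
                net -= INHIBIT * units[rival]
        new[u] = max(0.0, min(1.0, act * (1 - DECAY) + net))
    units.update(new)

for _ in range(20):
    step({"FRU_personA": 0.3})       # repeatedly "seeing" person A's face

# person B's PIN is now partly active via the shared semantic unit, so a
# following encounter with person B is verified faster (semantic priming)
print({u: round(a, 2) for u, a in units.items()})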
face selective neurons
- signal face familiarity, PINs are modality free gateways to semantic info
- no separate node for names; part of semantic information
- inferior temporal cortex - some neurons fire only when hands or faces are seen, but don't fire when fruit or genitalia are seen
- superior temporal sulcus used in social interactions, social perception and speech
logic of gnostic units
cells in inferior temporal cortex are selective to complex stimuli, giving credence to hierarchical theories of object perception
- early visual cortex codes elementary features such as line orientation and colour
- cells at highest level in hierarchy code specifically to shapes such as hands or faces
grandmother cells
hypothetical neuron
idea that single neuron will detect and represent a single object
linked to sparseness in neural coding
early stages object codes are distributed broadly, later stages become selective for combinations with code becoming more sparse
- ultimate sparseness leads to a single neuron mapping a single object
what is agnosia
when object recognition fails
damage to occipital or inferior temporal cortex
different types of agnosia reveal clues about processes involved with object recognition
visual agnosia
unable to recognise everyday objects
normal visual acuity, memory, language function and intelligence
move around without bumping into things
reach and pick up objects they dont recognise
apperceptive agnosia
unable to recognise objects due to inability to perceive form
move about and negotiate obstacles without difficulty
low level binding of feature is absent
unable to perform basic copying and matching tasks
in object recognition model it occurs at stage 2: shape representation
associative agnosia
copying and matching skills unimpaired
unable to name object even with intact perception of it
could draw it but still be unable to recognise
involves failure in accessing knowledge about object
in object recognition model it occurs at stage 3: object recognition
what is prosopagnosia
profound loss in ability to recognise faces
due to damage in right inferior temporal, ventral stream
unable to recognise familiar faces via visual input; recognition via other modalities, such as identification by voice, remains intact
face blindness
covert recognition
skin conductance measured in response to faces
larger peak amplitudes to familiar faces compared to unfamiliar faces
similar pattern found in prosopagnosia patients despite no overt recognition
capgras
recognition without feeling
recognise a face but not identity of individual
delusion that friend, spouse or close family member has been replaced by imposter that looks like them
can recognise someone by voice or characteristic behaviour
damage to dorsal stream
delusional misidentification
multi-sensory perception
process where information from each sense is brought together
advantages: efficiency, establishes a single coherent perspective of the world
vision affects sound, colour influences taste and sound influences hardness
McGurk illusion
what you see clashes with what you hear
‘ba’ presented to ears, ‘Ga’ presented to eyes, subject perceives ‘da’
fMRI shows that looking at moving lips activates auditory areas of the brain
what is synaesthesia
neurological, automatic process
one sensory/cognitive pathway is stimulated leading to involuntary experiences in second sensory/cognitive pathway
concrete perceptual experiences elicited by stimuli in the external environment or by internal thoughts
each person has different experiences of it
developmental synaesthesia
- genetics; runs in family
- present throughout life time
- equally present in males and females
- triggered by linguistic stimuli
- some regions of brain more connected in some people than others
- exuberant connectivity across brain, not just in regions related to synaesthesia
acquired synaesthesia
sensory deprivation
pharmacologically triggered eg LSD
effects temporary
how does synaesthesia link to normal cognition
links between vision and touch
Blakemore 2005 - watching someone being touched activates one's own somatosensory cortex, but watching an object being touched produces no such activation
number - space synaesthesia
see numbers in spatial array
small numbers on left, big numbers on right
faster responding to small numbers with left hand and faster responding to big numbers with right hand
culturally independent
what is attention
“taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalisation, concentration of consciousness” (James 1890)
attention as a process
selective attention: ability to preferentially process a subset of all available information
sustained attention: ability to maintain state of high alertness/arousal/vigilance
attention as resource
set of limited resources for cognitive processing
divided attention: ability to distribute attention over range of competing inputs
selective auditory attention
shadowing/dichotic listening
cocktail party phenomenon
Cherry 1953 - participants say very little about the non-shadowed ear; unable to remember the contents of the message or tell if the language changed; could tell if the voice was male or female and detect a sudden tone change
early perception model
perception -> sensory buffer -> selective filter -> limited capacity processor
filter selects information on the basis of its gross physical properties
evidence that info beyond physical is processed:
- SCRs to words previously paired with shock, even though the words were not consciously heard
- shadowing follows meaningful words if the channels are switched
- debate over whether selection occurs early or late in the processing stream
- Treisman's attenuation model: one's own name is heard in attenuated channels (cocktail party effect)
attenuation model
perception -> sensory buffer -> attenuator -> attenuated channel/selected channel -> semantic analysis
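a toy Python sketch contrasting the early filter with the attenuator (the word lists, the listener's name and the low-threshold set are all invented):

attended   = ["the", "cat", "sat", "on", "the", "mat"]
unattended = ["your", "train", "leaves", "now", "Anna"]   # contains the listener's name

LOW_THRESHOLD = {"Anna", "fire", "help"}    # personally salient / high-priority items

def broadbent_filter(att, unatt):
    # early selection: the unattended channel is blocked on physical grounds,
    # so nothing from it reaches semantic analysis
    return att

def treisman_attenuator(att, unatt):
    # the unattended channel is attenuated, not blocked: low-threshold items
    # (e.g. one's own name) can still break through to semantic analysis
    return att + [w for w in unatt if w in LOW_THRESHOLD]

print(broadbent_filter(attended, unattended))     # the name never gets through
print(treisman_attenuator(attended, unattended))  # ...but "Anna" breaks through here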
selective visual attention
- selectively process subset of visual input
- small area of the retina (fovea) capable of processing visual info with a high degree of acuity
- compensated for by moving the eyes 2-3 times a second
- eye movements and attention are linked
- phenomenon of inattentional blindness reveals how little info is taken in from surroundings
- visual attention explored with visual search tasks (parallel vs serial processing)
parallel vs serial processing
parallel searches have flat set-size functions: a red dot is spotted in the same time regardless of how many green dots surround it
serial searches have a positive set-size function: response time increases with the number of items
feature integration
visual search
basic feature analysis: colour/orientation/intensity; occurs in parallel; targets defined by a single feature pop out instantly
feature integration: attention acts as visual 'glue', allowing different features to be combined into a coherent percept
conjunction searches: positive set-size function; each stimulus is processed one at a time in order to bind its features together (see the sketch below)
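a toy Python simulation of the set-size functions described in the two cards above (slopes, intercepts and noise values are invented, just to show a flat vs a positive function):

import random

def feature_search_rt(set_size):
    # target defined by a single feature: processed in parallel, so RT is
    # roughly flat across set sizes ("pop out")
    return 450 + random.gauss(0, 20)

def conjunction_search_rt(set_size):
    # target defined by a conjunction of features: items checked one at a time,
    # giving a positive set-size function
    return 450 + 40 * set_size + random.gauss(0, 20)

for n in (4, 8, 16, 32):
    print(f"set size {n:2d}: feature ~{feature_search_rt(n):.0f} ms, "
          f"conjunction ~{conjunction_search_rt(n):.0f} ms")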
attention and automaticity
what happens when attention not required
automaticity results from extensive practice, e.g. reading or driving familiar routes
source of action slips, e.g. intending to take an unusual route home but following the familiar one instead
automatic processing is inevitable and, once activated, runs to completion
divided attention
refers to doing more than one thing at a time
three factors influence the extent to which two tasks can be successfully carried out simultaneously: how similar they are, how practised they are and how difficult they are
never truly multitasking, as one task always shows decreased performance
anxiety
trait anxiety: individual differences in the tendency to experience anxiety
state anxiety: psychological and physiological reactions to situations in the moment
emotional stroop test
name colour that threat related word is written in
increased interference from threat-related words found in those with PTSD, panic disorder, OCD, social phobia, specific phobia
visual attention tasks
dot probe task
words and pictures
visual search
attention to faces in anxiety
- Bradley, Mogg, Millar 2000
- dot probe task to look at attention to different levels of face
- student sample
- self-reported state of anxiety/depression
- avoidance score for happy face decreased as state anxiety increased
- avoidance score for threat face increased as state anxiety increased
theoretical issues of biases
are biases unconditional?
is attentional bias cause or effect?
does anxiety cause attentional bias?
are attentional biases specific to anxiety?
Purkis, Lester, Field 2011
- spider phobic participants found spider images distracting but not Doctor Who images
- Doctor Who experts didn't find spiders distracting but did find Doctor Who images distracting
are attentional biases a cause
- training biases: probes consistently presented in the location of threat or non-threat items according to training group; novel items then test for an induced bias
- after training, the avoidance score was around -5 and the vigilance score around 25 (see the sketch below for how such scores are computed)
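a minimal Python sketch of how a dot-probe bias score is typically computed (the reaction times are made-up numbers; positive scores indicate vigilance for threat, negative scores avoidance):

def bias_score(rt_probe_at_neutral_ms, rt_probe_at_threat_ms):
    # faster responses when the probe replaces the threat image imply attention
    # was already at the threat location (vigilance); slower responses imply avoidance
    return rt_probe_at_neutral_ms - rt_probe_at_threat_ms

print(bias_score(520, 495))   #  25 -> vigilance towards threat
print(bias_score(500, 505))   #  -5 -> avoidance of threat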
neuropsychology
“reverse engineering” of brain
infer cognitive functions by measuring the effect on the rest of the system when a component is removed
neglect background
"failure to report, respond or orient to meaningful contralesional stimuli" patients often: -shave/make up one side of face -eat off one side of plate -read text on one side of page -bump into walls/ignore people on their left -unaware of difficulties "agnosognosia"
causes of neglect
stroke affecting right side of brain-parietal lobe
area most associated with neglect in sample of 20 was right side parietal lobe
tests for neglect
line bisection: mark the midpoint of a line; patients with neglect place the midpoint to the right
picture copying: copy a flower; patients with neglect draw only the right half
cancellation: circle all instances of a symbol; patients with neglect miss symbols on the left side
recovery from neglect
usually recovers spontaneously within a few weeks/months
self portrait drawings at progressive stages after stroke show recovery-draw more of left side each time
what frames of reference does neglect operate on
egocentric reference frame; things to left of oneself
object centred reference frame; the left side of an object itself is neglected, regardless of where the object falls relative to the viewer
intrinsic axis; neglect patients extract the object's dominant axis and ignore what's on its left, even if it is not the egocentric left
task: identify the gap in a line image; harder to identify when the gap falls to the left of the intrinsic axis
blind sight in neglect
task: picture of house, picture of same house on fire
asked if pictures same or different, asked which house preferred
- patients with neglect say the pictures are the same, but prefer the house that is not on fire
-suggests unconscious processing can influence action/decision making
does neglect affect ‘mental representation’
patients asked to draw object from memory
patients with neglect tend to draw only half the numbers on a clock
neglect and visual imagery
Bisiach's Milan square experiment: Italian participants asked to imagine the square and report what is on its left; patients with neglect cannot report what was on the left of the imagined square
extinction
mild form of neglect
only occurs when two or more objects presented at same time
the therapist holds up two hands; if one hand is a fist and the other has fingers raised, the patient can see both hands
if the therapist raises exactly the same number of fingers on both hands, the patient with neglect can't see the hand on the left
Rees 2000
patient correctly scores 58/60 when stimuli are presented in the left visual field alone
patient scores 2/60 when a stimulus in the left visual field is accompanied by a stimulus in the right visual field
extinguished items still activated visual cortex despite lack of awareness
neglect is impairment of attention not perception
episodic memory
- reference is to oneself
- organised temporally
- events are recalled 'consciously'
- susceptible to forgetting
- context dependent
semantic memory
- reference is to knowledge only
- not organised temporally
- events are 'known'
- relatively permanent
- context independent
evidence for episodic-semantic distinction
interdependence of systems makes distinction unclear
some evidence from different pathologies eg semantic dementia
evidence from semantic dementia
neuropathology: non alzheimer type degenerative pathology of polar and inferolateral temporal cortex, sparing of hippocampus in early stages
symptoms: progressive selective deterioration in semantic memory, reasonably preserved episodic memory
knowledge and language
- informs many aspects of language processing
- much more than a store of word meanings or grammar rules
- when we read or listen to spoken language, we use knowledge to make inferences
- ability to make correct inferences underlies language comprehension
structure of semantic memory
feature comparison model
prototype model
exemplar model
network model
feature comparison model
concepts are stored as lists of defining or characteristic features
sentence verification task: reaction time taken to verify a sentence, e.g. 'is a carrot a vegetable?' vs 'is an artichoke a vegetable?'; quicker to answer the carrot question
limitations: few concepts can be reduced to list of defining characteristics, features not independent
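a toy Python sketch of the sentence verification idea above, assuming a two-stage comparison of feature overlap (the feature lists and thresholds are invented for illustration):

VEGETABLE_FEATURES = {"edible", "plant", "savoury", "grows in ground"}
CONCEPTS = {
    "carrot":    {"edible", "plant", "savoury", "grows in ground", "orange"},
    "artichoke": {"edible", "plant", "savoury", "thistle"},
}

def verify(member, category_features, high=0.8, low=0.3):
    # stage 1: compare overall feature overlap; only intermediate overlap
    # forces the slower stage 2 comparison of defining features
    overlap = len(CONCEPTS[member] & category_features) / len(category_features)
    if overlap >= high:
        return "fast YES (stage 1: high overall feature overlap)"
    if overlap <= low:
        return "fast NO (stage 1: low overlap)"
    return "slow YES/NO (stage 2: compare defining features only)"

print("carrot:", verify("carrot", VEGETABLE_FEATURES))        # fast YES
print("artichoke:", verify("artichoke", VEGETABLE_FEATURES))  # slower, needs stage 2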
prototype model
category membership not clear cut
define ‘centre’ of category not the boundary
prototypicality effect: more prototypical members are verified/categorised faster
categories have graded structure, some members more prototypical than others
key claims: prototypes are processed differently from non-prototypes; the examples of a category most often generated are those rated most prototypical; prototypes share the most features with other category members
objects categorised at 3 levels: superordinate, basic, subordinate
prototypes change with context, lose information