Memory Flashcards
Sally Andrews lectures
What is cognitive psychology?
The internal processes involved in making sense of the environment and deciding what action might be appropriate.
What is the advantage of cognitive models?
o Cognitive models provide a mediating, functional level of description that helps integrate data and test hypotheses
What is a difficulty of cognitive psychology and how does it attempt to overcome this?
o A difficulty with cognitive psychology is that mental processes are not directly observable: they happen inside the brain
o It draws on many other disciplines to provide metaphors and methods for measuring these unobservable processes
Describe the main metaphor for information processing in the brain from the 1950s-80s
• From the 1950s-80s: information processing
o Information processing was described using the computer metaphor: the mind is a symbol-processing system like a computer. Psychologists drew this metaphor from working with engineers during the development of computers, and it advanced thinking in cognitive psychology and in the assessment of intelligence
What is information processing?
Information processing- the acquisition, storage and manipulation of symbols to meet task demands
What is the metaphor for information processing from the 1980’s to now?
• From the 1980s to now: connectionist framework
o Neural metaphor- the mind is a network of interconnected processing units
o Processing consists of transmission of activation and inhibition within these networks (a toy sketch follows)
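A minimal, hypothetical Python sketch of this metaphor (illustrative only, not from the lecture): three interconnected units repeatedly pass activation (positive weights) and inhibition (negative weights) to one another.

import numpy as np

# Hypothetical 3-unit network: weights[i, j] is the connection from unit j to unit i.
# Positive weights transmit activation, negative weights transmit inhibition.
weights = np.array([
    [0.0,  0.8, -0.4],
    [0.5,  0.0, -0.3],
    [-0.6, 0.2,  0.0],
])
activation = np.array([1.0, 0.0, 0.0])  # external input activates unit 0

for step in range(10):
    net_input = weights @ activation            # summed excitation and inhibition
    activation = 1 / (1 + np.exp(-net_input))   # squash net input into a 0-1 activation
    print(step, np.round(activation, 3))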
What developed from the 1990’s to now which provides insight into the brain mechanisms underpinning cognitive processes?
• From the 1990s to now: cognitive neuroscience
o Neuroimaging: Many cognitive functions can be localised to particular neural regions
o Identifying and investigating how these areas respond to experimental manipulations provides insight into the brain mechanisms underpinning cognitive processes
Describe Weisberg et al.'s (2008) experiment and findings on the seductive allure of neuroscience
• Weisberg et al. (2008)
o Gave people good and bad explanations of a psychological finding (that people who know a fact overestimate how many others know it), each presented either with or without irrelevant neuroscience terms
o Presented these explanations to several groups of people:
Novices- undergraduates not studying cognitive neuroscience
• Novices rated bad explanations with neuroscience as higher in quality than bad explanations without neuroscience
Early cognitive neuroscience students
• Rated explanations with neuroscience, whether good or bad, higher than explanations without neuroscience
Experts- graduates and academics working in neuroscience
• Rated explanations with added neuroscience as poorer
o Found that both novices and early students could discriminate between good and bad explanations, but were more inclined to believe neuroscience ones
o Experts rejected irrelevant neuroscience
Describe Keehner et al.'s findings on brain images and credibility
o Explanations paired with 3D brain images were rated as more scientifically credible than explanations paired with other types of images
Describe Fernandez-Duque et al.'s experiment and findings on neuroscience information, brain images and credibility
o Gave good and bad explanations to undergraduate students either with neuroscience information, with no neuroscience information, or with an added brain image
o Neuroscience information increased the judged quality of both good and bad explanations, even when compared with other types of scientific information
o No additional effect of including brain image
Describe Marr’s 1982 levels of description of complex systems
• Marr 1982
o Computational
What needs to be computed for the task to be carried out
o Conceptual: Representation and algorithm
The form in which information is represented and the steps or procedures that occur to transform inputs into outputs
o Hardware
Physical means by which the representation and algorithm are implemented
• Can look at cognitive processes even without understanding of the hardware
What are the 4 broad methods used in measuring human cognition?
- Experimental cognitive psychology
- Cognitive neuropsychology
- Computational modelling
- Cognitive neuroscience
What is the aim and process of experimental cognitive psychology?
Aim:
- Develop theories that will explain behaviour
- Aims to understand human cognition by observing the behaviour of people performing various cognitive tasks
Process:
- Develop theories of cognitive processes underlying a task
- Use behavioural evidence to test theories
What are the limitations of experimental cognitive psychology?
- Theories are often abstract and hard to test empirically
- Tests rely on inferences (indirect evidence), e.g. face inversion effect experiments
- Can lack ecological validity (people's behaviour in real life differs from their behaviour in the lab)
- Paradigm specificity: findings obtained from any given experimental task are sometimes specific to that paradigm and do not generalise to other apparently similar tasks
- Most cognitive tasks are complex and involve many different processes
What are the strengths of experimental cognitive psychology?
- Source of most of the theories and tasks used by other approaches
- Can be applied to any aspect of cognition
- Has strongly influenced social, clinical and developmental psychology (e.g. has explained how we recognise faces: through holistic processing rather than analysing feature by feature; Thompson 1980)
- Enormous influence on the development of cognitive tasks and on task analysis
What is the aim and process of cognitive neuropsychology?
Aim:
This approach involves studying brain-damaged patients to understand normal human cognition.
Process:
- Use patterns of impairment after brain injury to infer the functional organisation of the brain: dissociations between different tasks imply that they rely on different neural systems (especially double dissociations)
What are the limitations of cognitive neuropsychology?
- Excessive reliance on single-case studies
- Assumes isomorphism between physical and functional brain organisation
- Assumes modularity and domain specificity (modules operate independently from each other and are function specific)
- Assumes uniformity of functional architecture across people
- Assumes subtractivity (brain damage impairs one or more processing modules but does not add or change anything)
- Minimises the interconnectedness of cognitive processes
What are the strengths of cognitive neuropsychology?
- Causal links can be shown between brain damage and cognitive performance
- Duchaine et al. (2006)
  - An upright face is shown briefly, then two alternative faces are briefly shown and the participant must decide which one matches the first upright face
  - The same task is done with inverted faces
  - People usually recognise upright faces more accurately than inverted ones, but people with prosopagnosia (difficulty recognising faces) fail to show the face inversion effect and perform poorly with both upright and inverted faces
- Has revealed unexpected complexities in cognition
What is the aim and process of computational modelling?
Aim:
Tries to instantiate cognitive models in computer programs that can be used to predict behaviour
Process:
- Create a computer program based on a model of task performance, which requires precise specification of all details of the model
- Run the computer program and compare its performance with people's performance (a toy sketch follows)
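A toy, hypothetical Python example of those two steps (not any real published model): a simple probabilistic "model" of a memory span task is run many times and its span is compared with a typical human span.

import random

def model_recall(list_length, decay_per_item=0.07):
    """Each item is recalled with a probability that falls as the list gets longer."""
    p = max(0.0, 1.0 - decay_per_item * list_length)
    return sum(random.random() < p for _ in range(list_length))

def simulated_span(trials=2000):
    """Largest list length at which the model recalls at least half the items on average."""
    span = 0
    for length in range(1, 15):
        mean_recalled = sum(model_recall(length) for _ in range(trials)) / trials
        if mean_recalled / length >= 0.5:
            span = length
    return span

human_span = 7  # illustrative value (7 +/- 2 items)
print("model span:", simulated_span(), "vs typical human span:", human_span)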
What are the limitations of computational modelling?
- Often have to specify details that are not part of the theory
- The fact that a task can be done that way does not mean it is how people do it
- Human cognition is influenced by emotional and motivational factors that are hard to replicate
What are the strengths of computational modelling?
- Forces a theory to be fully specified so that all of its details are accounted for
What are the aims and processes of cognitive neuroscience?
Aim:
Tries to find evidence of cognitive processes assumed in cognitive theories by looking at brain activity
Process:
- Uses information about behaviour and the brain to understand human cognition
- Takes snapshots of brain activity while people are performing cognitive tasks, using different brain imaging technologies
- Seems to provide a direct measure of the brain regions underlying performance
What are the limitations of cognitive neuroscience?
- Different measures reflect different aspects of brain function, and the techniques require effective application of cognitive psychological methods
- Methods can be invasive or have poor spatial/temporal accuracy
- Difficult to bridge the divide between psychological processes and concepts on the one hand and patterns of brain activation on the other
- False positives are common
- Most brain-imaging techniques reveal only associations between patterns of brain activation and behaviour
- Problems of ecological validity and paradigm specificity
What are the strengths of cognitive neuroscience?
- Has helped resolve theoretical controversies that remained intractable with purely behavioural studies. For example, in speech perception, Wild et al. (2012) found more activity in primary auditory cortex when visual input matched auditory input than when it did not, suggesting that knowledge of what was being presented directly affected basic auditory processing
- Has identified an increasing number of major networks: relies less on assumption of functional specialisation
- Great variety of techniques offering excellent temporal or spatial resolution
What are the 3 types of memory?
- Sensory memory
- Short term memory
- Long term memory
What is sensory memory and what subtypes does it have?
A brief, literal record of an event which decays very quickly. Subtypes: iconic memory (holds visual information for about 500 milliseconds) and echoic memory (holds auditory information for approximately 2 seconds)
Describe Sperling’s 1960 paradigm for iconic memory and its results
o Sperling 1960- iconic memory paradigm with 12 letters in 3 rows shown for 100 milliseconds
Found that on average, people can recall about 4.5 letters in the full report procedure (report all the letters)
Found that on average, people can recall about 3-4 of the 4 letters in the cued row in the partial report procedure (report only one row, chosen after the letters are shown), which implies that roughly 9-12 of the 12 letters were briefly available in iconic memory
What is short-term memory?
Buffer for temporary maintenance of information
o Only a portion of information in sensory memory makes it into short-term memory
What are the 3 types of long term memory and what are they?
• Declarative memory/explicit memory
o Semantic memory
Abstract and semantic: memory is independent of how/when/whether the memory was stored
Stores facts and concepts
o Episodic memory
Stores autobiographical events: specific memory tied to particular time and place (tagged with location, content and temporal features)
Linked to the self
• Procedural memory (implicit, non-declarative)
Memory of how to do things: often unable to be verbalised or consciously accessed
Automatic behaviours
-Classical conditioning effects
-Priming (implicit activation of concepts in long-term memory)
Compare the capacity of short term memory and long term memory
Short-term memory:
Limited (7±2 items)
Long term memory:
unlimited
Compare the rate of forgetting of short term memory vs long term memory
Short term memory:
Decays within 30 seconds if not rehearsed
Long term memory:
Forgetting/inability to retrieve due to interference from other items or difficulties in retrieving information rather than decay
Compare the type of code used for short term memory vs long term memory
Short term memory:
Phonological
Long term memory:
Semantic
Who developed the multistore model, and when?
Atkinson and Shiffrin in 1968
What is Atkinson and Shiffrin's 1968 multistore model?
o Traditional view of memory
o Structural view of memory- short-term and long-term memory rely on separate memory systems (and different neural systems) with different properties
o Sensory memory, short-term memory and long-term memory are separate systems ordered as above
Information comes into the sensory store, where some of it decays
If attention is paid to information in sensory memory, it passes into short-term memory
• If not rehearsed, information in short-term memory is lost through displacement
If information in short-term memory is rehearsed, it is transferred into long-term memory, where it can be lost through interference (a toy walk-through follows)
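A toy, illustrative Python walk-through of that flow (not Atkinson and Shiffrin's own formalism): attended items move from the sensory store into short-term memory, unattended items decay, excess items are displaced, and only rehearsed items reach long-term memory.

SHORT_TERM_CAPACITY = 7  # roughly 7 +/- 2 items

sensory_store = ["A", "B", "C", "D", "E", "F", "G", "H", "J", "K"]
attended = {"A", "B", "C", "D", "E", "F", "G", "H"}  # unattended items decay
rehearsed = {"A", "B"}                               # only rehearsed items reach LTM

short_term_memory = []
long_term_memory = []

for item in sensory_store:
    if item not in attended:
        continue                          # item decays in the sensory store
    short_term_memory.append(item)        # attention moves the item into STM
    if len(short_term_memory) > SHORT_TERM_CAPACITY:
        short_term_memory.pop(0)          # oldest item is displaced
    if item in rehearsed:
        long_term_memory.append(item)     # rehearsal transfers the item to LTM

print("STM:", short_term_memory)  # ['B', 'C', 'D', 'E', 'F', 'G', 'H']
print("LTM:", long_term_memory)   # ['A', 'B']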
What are the limitations of the multi-store model of memory?
Processing is not entirely bottom-up (does not necessarily run strictly and independently from sensory stores to short-term memory to long-term memory)
• There are semantic influences of context from long-term memory which can influence performance on short-term memory tasks
• Short-term memory is sensitive to semantic information
• Memory is a function of type of processing, not where it is stored
Wrongly assumes that attention occurs only after information is held in the sensory stores
Assumes that all items have equal importance, which is incorrect: the item currently receiving attention can be accessed more rapidly than the other items in short-term memory
Implies that only information processed consciously can be stored in long-term memory, which contradicts implicit learning
What is the release from proactive interference paradigm (Brown-Peterson 1959) and what does it show?
o Release from proactive interference paradigm (Brown-Peterson 1959)
Short-term memory task
• People presented with 3 items
• Count backwards by 3s
• Recall in correct order
Proactive interference (next two short-term memory task trials)
• Same short-term memory task involving items from the same category as the first trial
• Recall deteriorates across successive trials from the same category due to semantic relationships between items of successive trials
Release from proactive interference (4th trial)
• Recall improves when items from a different semantic category are presented
• Extent of improvement depends on similarity of semantic categories -the less similar the semantic category from the previous trials, the more improvement
• Provides evidence that semantic information in long-term memory influences short-term memory performance
What is evidence that memory is a function of the type of processing, not where it is stored?
o Levels of processing (Craik and Lockhart 1972)
Investigated incidental memory for words presented with different orienting tasks designed to manipulate the level of processing
• Print based task
• Sound based task
• Meaning based task
Memory was best when the encoding task required meaning-based processing rather than sound- or print-based processing; this was not due simply to longer encoding times for the deeper encoding tasks
What is evidence that there are semantic influences of context from long-term memory which can influence performance on short-term memory tasks?
o Example- sentence recall (people can remember more than 7±2 items when meaningful information holds the items together)
Semantic information is influencing performance on short-term memory tasks
o Release from proactive interference paradigm (described above)
Describe the adaptive control of thought (ACT) model. Who developed it, and when?
o Developed by John Anderson
o Interactive memory architecture
o Replaced the multi-store model
o Increased interaction between declarative memory, production memory and working memory
o Working memory -
The system in which incoming information is processed and integrated with retrieved existing declarative and procedural memories
Not just a storage location (as opposed to its role in multi-store model)
Role in early stages of encoding memory
o Long-term memory
Declarative memory
• Semantic memory
• Episodic memory
Procedural memory
What are letter span tasks, what do they show and what are the usual results and influences on these results?
• Participants remember a series of letters they have been shown
• Show that we rely on a speech-based code for rehearsal in short-term memory, as fewer items are remembered when people have to count out loud or when the letters sound the same
• People normally remember 7±2 individual items
o Remembering fewer than 5 can be a sign of neuropsychological damage
• If words rather than individual letters are shown, people can remember more letters because one word counts as one item (chunking)
o If the words are related in meaning, recall is better
o The location of items within the list influences short-term memory
o The length of the words influences short-term memory
Describe the components of Baddeley’s original 1974 working memory model
o Rather than a passive short-term store, working memory consists of multiple specialist components
- Central executive: responsible for memory processes
- Phonological loop
- Visuospatial sketchpad
Describe features of the central executive
• Limited capacity, modality free control system responsible for co-ordination, selection, allocation of attention resources etc.
Describe the phonological loop
• Phonological loop-maintains verbal information in a speech-based code with limited information capacity. Primary mechanism is rehearsal. Consists of 2 components:
o Passive phonological store directly concerned with speech perception
o Articulatory process linked to speech production, giving access to the phonological store
Describe the visuospatial sketchpad
• Visuospatial sketchpad- maintains temporary visual/spatial information using spatial/visual rehearsal processes. Consists of 2 components
o Visual cache which stores information about visual form and colour
o Inner scribe which processes spatial and movement information
Why is Baddeley’s working memory model useful for diagnosis of specific problems?
o Model used for diagnosing source of memory problems (by distinguishing central executive difficulties from specific problems in phonological processing…)
How was Baddeley’s original 1974 model of working memory limited?
o Model limited in that:
Central executive is poorly specified
• Mainly tested through dual task decrement: poorer performance when required to do two tasks at once
o Evidence shows that decrements depend on the similarity of task demands, but this does not explain what the central executive does or the factors that limit its capacity
Does not explain the contents of working memory, how it is organised, or how its contents are loaded, updated and refined
Working memory integrates new input with long-term memory, but original model had no way of achieving this
Describe the changes Baddeley made to his original working memory model from 2000-2012
o Specified relationship between working memory and long-term memory
o Added an episodic buffer to try to explain how content of working memory is determined
o More recent versions link the phonological loop and visuospatial sketchpad to the episodic buffer
What are executive functions and what brain part are they associated with and how do we know this?
o Executive processes control and regulate thought and action and are associated with the frontal lobes
o People with brain injury to these areas show deficits on neuropsychological tests such as the Wisconsin card sort, but these are very impure measures
What is the Wisconsin card sort?
Wisconsin card sort- an array of cards varying in shape and colour is given to the participant, who must match each card to a test card by either shape or colour. The sorting rule is changed midway through the test to assess how well people can adapt to rule changes (a simplified sketch follows)
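A simplified, hypothetical Python sketch of the task as described above (the real test also gives trial-by-trial feedback from which the rule must be inferred): each card is scored against the current sorting rule, and the rule switches from colour to shape midway through.

cards = [("circle", "red"), ("square", "blue"), ("circle", "blue"),
         ("square", "red"), ("circle", "red"), ("square", "blue")]
test_card = ("circle", "red")  # cards are compared against this reference card

def matches(card, rule):
    """Does the card match the test card under the current sorting rule?"""
    index = 0 if rule == "shape" else 1
    return card[index] == test_card[index]

rule = "colour"
for trial, card in enumerate(cards):
    if trial == len(cards) // 2:
        rule = "shape"  # mid-test rule change: poor adaptation shows up as errors here
    print(trial, card, "rule:", rule, "->", "match" if matches(card, rule) else "no match")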
What are the 3 independent executive processes and who identified these?
o Miyake identified 3 independent executive processes
- Updating
- Shifting
- Inhibition
What is the updating executive process used for
• Used to monitor and engage in rapid addition or deletion of working memory contents
What is the shifting executive process used for
• Used to switch flexibly between tasks or mental sets
What is the inhibition executive process used for
• Used to deliberately override dominant responses and to resist distraction
How is the updating executive function assessed
• Assessed by the letter memory task: letters are presented at a rate of 1 per second and participants must always remember the last 3 letters (a minimal sketch follows)
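A minimal Python sketch of the updating demand in this task (the letter stream is hypothetical): as each new letter arrives, working memory contents must be added to and deleted from so that only the last 3 letters are held.

from collections import deque

stream = ["T", "H", "G", "B", "S", "K", "R"]  # letters presented one per second
last_three = deque(maxlen=3)                  # only the 3 most recent letters are kept

for letter in stream:
    last_three.append(letter)                 # new letter added, oldest dropped automatically
    print("after", letter, "->", list(last_three))

# At the end of the stream the report should be: ['S', 'K', 'R']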
How is the shifting executive function assessed
• Assessed by colour-shape task: need to classify each card target by colour or shape depending on updated instructions
How is the inhibition executive function assessed
• Assessed by the antisaccade task: a cue flashes on one side of the screen and participants must report the direction of an arrow presented on the opposite (non-flashed) side, which requires inhibiting the reflexive glance towards the flash
Describe the unity and diversity framework of executive functions Miyake and Friedman found in 2012, and how they did so
Miyake and Friedman 2012 did a factor analysis of data from 3 types of updating, shifting and inhibition tasks using a large twin sample
Showed that updating, shifting and inhibition all related to a common executive function factor involved in maintaining task goals and filtering relevant/irrelevant information
Showed that updating and shifting also make separate independent contributions, especially shifting as relating to cognitive flexibility
Showed that there were substantial, independent genetic contributions to common factors and specific executive function factors
What is the episodic buffer?
Episodic buffer integrates information from phonological loop and visuospatial sketchpad input. It acts as a buffer between the other components of the working memory system and also links working memory to perception and long-term memory.
Has a capacity of 4 chunks
It is a storage facility for information from the phonological loop and visuo-spatial sketchpad
Describe the unitary memory model
• Unitary memory model (Cowan 2010)
o Unitary memory view: working memory consists of temporary activations through focus of attentional resources of long-term memory representations or of representations of items that were recently perceived
According to the unitary memory model, why do some amnesic patients have intact short term memory but bad long-term memory?
o Argues that amnesic patients who have intact short-term memory but poor long-term memory have special problems in forming novel relations in both short-term and long-term memory
Do findings of neuroimaging studies support the unitary memory model?
o However, findings of neuroimaging studies are not supportive of the unitary-store approach: there is little evidence of hippocampal involvement when attempts are made to separate out short-term and long-term memory processes