Memory Flashcards
Sally Andrews lectures
What is cognitive psychology?
The internal processes involved in making sense of the environment and deciding what action might be appropriate.
What is the advantage of cognitive models?
o Cognitive models provide a mediating, functional level of description that helps integrate data and test hypotheses
What is a difficulty of cognitive psychology and how does it attempt to overcome this?
o A difficulty with cognitive psychology is that mental processes are not directly observable: they happen inside the brain
o Draws on many other disciplines to provide metaphors and methods for measuring these unobservable processes
Describe the main metaphor for information processing in the brain from the 1950s-80s
• From the 1950s-80s: information processing
o Information processing was described using the computer metaphor, where the mind is a symbol-processing system like a computer (psychologists took this metaphor from working with engineers during the development of computers; it advanced thinking in cognitive psychology and the assessment of intelligence)
What is information processing
Information processing- acquisition, storage and manipulation of symbols to meet task demands
What is the metaphor for information processing from the 1980s to now?
• From the 1980s to now: connectionist framework
o Neural metaphor- the mind is a network of inter-connected processing units
o Processing consists of transmission of activation and inhibition within these networks
What developed from the 1990s onwards that provides insight into the brain mechanisms underpinning cognitive processes?
• From the 1990s to now: cognitive neuroscience
o Neuroimaging: Many cognitive functions can be localised to particular neural regions
o Identifying and investigating how these areas respond to experimental manipulations provides insight into the brain mechanisms underpinning cognitive processes
Describe Weisberg et al.'s (2008) experiment and findings on the seductive allure of neuroscience
• Weisberg et al. (2008)
o Gave people good and bad explanations of a psychological finding (that people overestimate how many others know a fact when they themselves know it), each presented either with or without irrelevant neuroscience terms
o Presented these explanations to several groups of people:
Novices- undergraduates not studying cognitive neuroscience
• Novices rated bad explanations with neuroscience as better quality than bad explanations without neuroscience
Early cognitive neuroscience students
• Rated explanations with neuroscience, whether good or bad, higher than explanations without neuroscience
Experts- graduates and academics working in neuroscience
• Rated explanations with added neuroscience more poorly
o Found that both novices and early students could discriminate between good and bad explanations, but were more inclined to believe neuroscience ones
o Experts rejected irrelevant neuroscience
Describe Keehner et al.'s findings on brain images and credibility
o 3D brain images increased ratings of the scientific credibility of explanations, relative to the same explanations paired with other types of images
Describe Fernandez-Duque et al.'s experiment and findings on neuroscience information, brain images and credibility
o Gave good and bad explanations to undergraduate students with neuroscience information, without neuroscience information, or with an added brain image
o Neuroscience information increased the judged quality of both good and bad explanations, even when compared with other types of scientific information
o No additional effect of including brain image
Describe Marr’s 1982 levels of description of complex systems
• Marr 1982
o Computational
What needs to be computed for the task to be carried out
o Conceptual: Representation and algorithm
The form in which information is represented and the steps or procedures that occur to transform inputs into outputs
o Hardware
Physical means by which the representation and algorithm are implemented
• Can look at cognitive processes even without understanding of the hardware
What are the 4 broad methods used in measuring human cognition?
- Experimental cognitive psychology
- Cognitive neuropsychology
- Computational modelling
- Cognitive neuroscience
What is the aim and process of experimental cognitive psychology
Aim:
Cognitive psychology: develop theories that will explain behaviour
-Aims to understand human cognition by observing the behaviour of people performing various cognitive tasks
Process:
- Develop theories of cognitive processes underlying a task
- Use behavioural evidence to test theories
What are the limitations of experimental cognitive psychology
- Theories are often abstract and hard to test empirically
- Tests rely on inferences (indirect evidence), e.g. face inversion effect experiments
- Can lack ecological validity (people's behaviour in real life differs from that in the lab)
- Paradigm specificity: findings obtained from any given experimental task are sometimes specific to that paradigm and do not generalise to other apparently similar tasks
- Most cognitive tasks are complex and involve many different processes
What are the strengths of experimental cognitive psychology?
- Source of most of the theories and tasks used by other approaches
- Can be applied to any aspect of cognition
- Has strongly influenced social, clinical and developmental psychology (e.g. has explained how we recognise faces: through holistic processing rather than feature-by-feature analysis; Thompson 1980)
- Enormous influence on the development of cognitive tasks and on task analysis
What is the aim and process of cognitive neuropsychology?
Aim:
This approach involves studying brain-damaged patients to understand normal human cognition.
Process:
-Use patterns of impairment after brain injury to infer the functional organisation of the brain → dissociations between different tasks imply that they rely on different neural systems (especially double dissociations)
What are the limitations of cognitive neuropsychology?
- Excessive reliance on single-case studies
- Assumes isomorphism between physical and functional brain organisation
- Assumes modularity and domain specificity (parts operate independently from each other and are function specific)
- Assumes uniformity of functional architecture across people
- Assumes subtractivity (brain damage impairs one or more processing modules but does not add or change anything)
- Minimises the interconnectedness of cognitive processes
What are the strengths of cognitive neuropsychology?
- Causal links can be shown between brain damage and cognitive performance
- Duchaine et al. (2006)
- An upright face is shown briefly, then two alternative faces are briefly shown and participants must decide which one matches the first upright face
- The same task is done with inverted faces
- People usually recognise upright faces more accurately than inverted ones, but people with prosopagnosia (difficulty recognising faces) fail to show the face inversion effect and perform poorly on both upright and inverted faces
- Has revealed unexpected complexities in cognition
What is the aim and process of computational modelling?
Aim:
Tries to instantiate cognitive models in computer programs that can be used to predict behaviour
Process:
- Create a computer program based on model of task performance which requires precise specification of all details of the model
- Run computer program and compare computer performance with people’s performance
What are the limitations of computational modelling?
- Often have to specify details that are not part of the theory
- The fact that a task can be performed in a particular way does not mean that is how people actually do it
- Human cognition is influenced by emotional and motivational factors that are hard to replicate in a program
What are the strengths of computational modelling?
- Forces theories to be detailed so that all of their components are explicitly specified
What are the aims and processes of cognitive neuroscience?
Aim:
Tries to find evidence of cognitive processes assumed in cognitive theories by looking at brain activity
Process:
- Uses information about behaviour and the brain to understand human cognition
- Takes snapshots of brain activity while people are performing cognitive tasks using different brain-imaging technologies
- Seems to provide a direct measure of the brain regions underlying performance
What are the limitations of cognitive neuroscience?
- Different measures reflect different aspects of brain function, and techniques require effective application of cognitive psychological methods
- Methods can be invasive or have poor spatial/temporal accuracy
- Difficult to bridge the divide between psychological processes and concepts on the one hand and patterns of brain activation on the other
- False positives are common
- Most brain-imaging techniques reveal only associations between patterns of brain activation and behaviour
- Problems of ecological validity and paradigm specificity
What are the strengths of cognitive neuroscience?
- Has helped resolve theoretical controversies that remained intractable with purely behavioural studies. E.g. speech perception: Wild et al. (2012) found more activity in primary auditory cortex when visual input matched auditory input than when it did not, suggesting that knowledge of what was being presented directly affected basic auditory processing
- Has identified an increasing number of major networks: relies less on assumption of functional specialisation
- Great variety of techniques offering excellent temporal or spatial resolution
What are the 3 types of memory?
- Sensory memory
- Short term memory
- Long term memory
What is sensory memory and what subtypes does it have?
A brief, literal record of an event which decays very quickly. Subtypes: iconic memory (holds visual information for about 500 milliseconds) and echoic memory (holds auditory information for approximately 2 seconds)
Describe Sperling’s 1960 paradigm for iconic memory and its results
o Sperling 1960- iconic memory paradigm with 12 letters in 3 lines shown for 100 milliseconds
Found that on average, people can recall about 4.5 letters in the full report procedure (report all the letters)
Found that on average, people can recall about 3-4 letters of the cued row in the partial report procedure (report only one row, chosen after the letters are shown), implying that most of the 12 letters were briefly available in iconic memory before decaying
What is short-term memory?
Buffer for temporary maintenance of information
o Only a portion of information in sensory memory makes it into short-term memory
What are the 3 types of long term memory and what are they?
• Declarative memory/explicit memory
o Semantic memory
Abstract and semantic: memory is independent of how/when/whether the memory was stored
Stores facts and concepts
o Episodic memory
Stores autobiographical events: specific memory tied to particular time and place (tagged with location, content and temporal features)
Linked to the self
• Procedural memory/implicit memory
Memory of how to do things: often unable to be verbalised or consciously accessed
Automatic behaviours
-Classical conditioning effects
-Priming (implicit activation of concepts in long-term memory)
Compare the capacity of short term memory and long term memory
Short-term memory:
Limited (7±2 items)
Long term memory:
unlimited
Compare the rate of forgetting of short term memory vs long term memory
Short term memory:
Decays within 30 seconds if not rehearsed
Long term memory:
Forgetting/inability to retrieve due to interference from other items or difficulties in retrieving information rather than decay
Compare the type of code used for short term memory vs long term memory
Short term memory:
Phonological
Long term memory:
Semantic
Who and when was the multistore model developed by?
Atkinson and Shiffrin in 1968
What is Atkinson and Shiffrin’s 1968 multistore model
o Traditional view of memory
o Structural view of memory- short-term and long-term memory rely on separate memory systems (and different neural systems) with different properties
o Sensory memory, short-term memory and long-term memory are separate systems ordered as above
Information comes into the sensory system, where some of it decays
If attention is paid to information in sensory memory, it passes into short-term memory
• If not rehearsed, information in short-term memory is lost through displacement
If information in short-term memory is rehearsed, it is transferred to long-term memory, where it can be lost through interference
What are the limitations of the multi-store model of memory?
Processing is not entirely bottom-up (does not necessarily run strictly and independently from sensory stores to short-term memory to long-term memory)
• There are semantic influences of context from long-term memory which can influence performance on short-term memory tasks
• Short-term memory is sensitive to semantic information
• Memory is a function of type of processing, not where it is stored
Wrongly assumes that attention occurs only after information is held in the sensory stores
Assumes that all items have equal importance, which is incorrect: the item currently receiving attention can be accessed more rapidly than the other items in short-term memory
Implies that only information processed consciously can be stored in long-term memory, which contradicts implicit learning
What is the release from proactive interference paradigm (Brown-Peterson 1959) and what does it show?
o Release from proactive interference paradigm (Brown-Peterson 1959)
Short-term memory task
• People presented with 3 items
• Count backwards by 3s
• Recall in correct order
Proactive interference (next two short-term memory task trials)
• Same short-term memory task involving word from same item category as first task
• Recall deteriorates across successive trials from the same category due to semantic relationships between items of successive trials
Release from proactive interference (4th trial)
• Recall improves when items from a different semantic category are presented
• Extent of improvement depends on similarity of semantic categories -the less similar the semantic category from the previous trials, the more improvement
• Provides evidence that semantic information in long-term memory influences short-term memory performance
What is evidence that memory is a function of type of processing, not where it is stored?
o Levels of processing (Craik and Lockhart 1972)
Investigated incidental memory for material presented with different orienting tasks with words to manipulate level of processing
• Print based task
• Sound based task
• Meaning based task
Memory was best when the encoding task required meaning-based processing rather than sound- or print-based processing; this was not due simply to the longer encoding time of deeper encoding tasks
What is evidence that there are semantic influences of context from long-term memory which can influence performance on short-term memory tasks
o Example: sentence recall (people can remember more than 7±2 items when meaningful information holds the items together)
Semantic information is influencing performance on short-term memory tasks
Release from proactive interference paradigm (see above)
Describe the adaptive control of thought (ACT) model, and who developed it and when?
o Developed by John Anderson (first version 1976, later revised as ACT* and ACT-R)
o Interactive memory architecture
o Replaced the multi-store model
o Increased interaction between declarative memory, production memory and working memory
o Working memory -
The system in which incoming information is processed and integrated with retrieved existing declarative and procedural memories
Not just a storage location (as opposed to its role in multi-store model)
Role in early stages of encoding memory
o Long term-memory
Declarative memory
• Semantic memory
• Episodic memory
Procedural memory
What are letter span tasks, what do they show and what are the usual results and influences on these results?
• Remember a series of letters that are shown
• Show that we rely on a speech-based code for rehearsal in short-term memory, as fewer items are remembered when participants have to count out loud or when the letters sound the same
• People normally remember 7±2 individual items
o If people remember fewer than 5, it is a sign of neuropsychological damage
• If words rather than individual letters are used, people can remember more letters because one word is one item (chunking)
o If the words are related in meaning, they are remembered better
o The position of items within the list influences short-term memory
o The length of the words influences short-term memory
Describe the components of Baddeley’s original 1974 working memory model
o Rather than a passive short-term store, working memory consists of multiple specialist components
- Central executive: responsible for memory processes
- Phonological loop
- Visuospatial sketchpad
Describe features of the central executive
• Limited capacity, modality free control system responsible for co-ordination, selection, allocation of attention resources etc.
Describe the phonological loop
• Phonological loop-maintains verbal information in a speech-based code with limited information capacity. Primary mechanism is rehearsal. Consists of 2 components:
o Passive phonological store directly concerned with speech perception
o Articulatory process linked to speech production, giving access to the phonological store
Describe the visuospatial sketchpad
• Visuospatial sketchpad- maintains temporary visual/spatial information using spatial/visual rehearsal processes. Consists of 2 components
o Visual cache which stores information about visual form and colour
o Inner scribe which processes spatial and movement information
Why is Baddeley’s working memory model useful for diagnosis of specific problems?
o Model used for diagnosing source of memory problems (by distinguishing central executive difficulties from specific problems in phonological processing…)
How was Baddeley’s original 1974 model of working memory limited?
o Model limited in that:
Central executive is poorly specified
• Mainly tested through dual task decrement: poorer performance when required to do two tasks at once
o Evidence shows that decrements depend on similarity of task demands, but this does not explain what the central executive does or the factors that limit its capacity
Does not explain the contents of working memory, how they are organised, or how they are loaded, updated and refined
Working memory integrates new input with long-term memory, but original model had no way of achieving this
Describe the changes Baddeley made to his original working memory model in 2000-2012
o Specified relationship between working memory and long-term memory
o Added an episodic buffer to try to explain how content of working memory is determined
o More recent versions link the phonological loop and visuospatial sketchpad to the episodic buffer
What are executive functions and what brain part are they associated with and how do we know this?
o Executive processes control and regulate thought and action and are associated with frontal lobes
o People with brain injury to these areas show deficits identified with neuropsychological tests such as the Wisconsin card sort, but these are very impure measures
What is the Wisconsin card sort?
Wisconsin card sort: an array of cards varying in shape and colour is given to the test subject, who is asked to match them to a test card by either shape or colour. The sorting category is changed midway through the test to assess how well people can adapt to rule changes
What are the 3 independent executive processes and who identified these?
o Miyake identified 3 independent executive processes
- Updating
- Shifting
- Inhibition
What is the updating executive process used for
• Used to monitor and engage in rapid addition or deletion of working memory contents
What is the shifting executive process used for
• Used to switch flexibly between tasks or mental sets
What is the inhibition executive process used for
• Used to deliberately override dominant responses and to resist distraction
How is the updating executive function assessed
• Assessed by letter memory task: presented with letters at a rate of 1 letter per second and always need to remember last 3 letters
How is the shifting executive function assessed
• Assessed by colour-shape task: need to classify each card target by colour or shape depending on updated instructions
How is the inhibition executive function assessed
• Assessed by the antisaccade task: a cue flashes on one side of the screen and participants must report the direction of an arrow presented on the opposite (non-flashed) side, which requires suppressing the reflexive look toward the flash
Describe the unity and diversity framework of executive functions Miyake and Friedman found in 2012, and how they did so
Miyake and Friedman 2012 did a factor analysis of data from 3 types of updating, shifting and inhibition tasks using a large twin sample
Showed that updating, shifting and inhibition all related to a common executive function factor involved in maintaining task goals and filtering relevant/irrelevant information
Showed that updating and shifting also make separate independent contributions, especially shifting as relating to cognitive flexibility
Showed that there were substantial, independent genetic contributions to common factors and specific executive function factors
What is the episodic buffer?
Episodic buffer integrates information from phonological loop and visuospatial sketchpad input. It acts as a buffer between the other components of the working memory system and also links working memory to perception and long-term memory.
Has a capacity of 4 chunks
It is a storage facility for information from the phonological loop and visuo-spatial sketchpad
Describe the unitary memory model
• Unitary model (Cowan 2010)
o Unitary memory view: working memory consists of temporary activations through focus of attentional resources of long-term memory representations or of representations of items that were recently perceived
According to the unitary memory model, why do some amnesic patients have intact short term memory but bad long-term memory?
o Argues that amnesic patients that have intact short term memory but bad long-term memory have special problems in forming novel relations in both short-term and long-term memory
Do findings of neuroimaging studies support the unitary memory model?
o However, findings of neuroimaging studies are not supportive of the unitary-store approach: there is little evidence of hippocampal involvement when attempts are made to separate out short-term and long-term memory processes
Describe Ranganath et al’s 2003 experiment on memory
o Ranganath et al 2003
Measured neural mechanisms involved in long term memory vs working memory
Same tasks to differentiate different processes
Working memory task
• Delayed face recognition paradigm
Long-term memory task
• Sequence of 9 faces separated by intervals of 20 seconds or so
• After delay of 5-10 minutes, given recognition memory tasks
Strongest activations in encoding for both working memory and long term memory is in the posterior of the brain (sensory processing areas)
Strongest activations in recognition for both working memory and long-term memory is in frontal, posterior and temporal areas
Describe the theory that there is only a single store of memory content and its evidence
• Rather than multiple memory systems, perhaps there is a single store of memory content but the processes that operate on it differ as a function of tasks demands/effort required etc.
o The memory representations supporting both short term memory tasks and long term memory tasks are stored in (or accessed through) posterior brain regions involved in initial perception/encoding
o Frontal areas are involved in the attentional/executive processes required for maintenance, updating, shifting attention, resisting interference…
What are the advantages of the working memory model as a whole
- Model concerned with both active processing and transient information storage, so its scope is much greater
- Model explains the partial deficits of short-term memory observed in brain-damaged patients
- Incorporates verbal rehearsal as an optional process within the phonological loop, which is much more realistic than the enormous significance given to rehearsal within the multi-store model
What are the limitations of the working memory model as a whole
- Only considers spatial, visual and phonological information
- Difficult to identify the number and nature of main executive processes associated with the central executive
What are psychometric approaches used for to explore working memory?
o Measuring the work done by working memory
o Aim: to get an index of working memory which allows you to assess differences between people in their working memory capacity so that it can be used to predict their performance in different tasks
How do you measure individual differences in working memory?
o To measure individual differences in working memory, need to assess overall capacity and not just separate components
What is working memory capacity?
o Working memory capacity- how much information an individual can process and store at the same time
What does a valid measure of working memory capacity in real-world tasks need to assess and how is this assessed?
o A valid measure of working memory capacity in real-world tasks needs to assess the demands of both storage and processing simultaneously, as both constitute the demands on working memory capacity
Use complex span tasks as they require both processing and storage, and can be implemented in different domains
• Complex span tasks correlate more highly with real world measures of performance than simple span tasks
What is evidence that complex span measures predict performance in real world tasks?
Daneman and Merikle 1996
Made subjects take standardised/non-standardised tests as well as complex vs simple span tasks
• There are domain-general and domain-specific contributions in complex span tasks
• Sanchez and Wiley (2006)
o Individuals higher in working memory are less vulnerable to seductive details effect
Gave people texts accompanied by conceptual images related to the text, by seductive images with little relevance to the text, or by no images at all
Classified people as high or low working memory capacity through complex span tasks
People then wrote essays about ice age causes and answered questions about inferences based on material they’d read
o Found that low working memory participants showed poorer performance when text accompanied by seductive images, while high working memory participants were not distracted by them
• Kane et al.(2007)
o Individuals higher in working memory are less vulnerable to mind wandering and sustain attention in more demanding and challenging tasks whilst individuals lower in working memory tended to reduce attention
124 students tested on 3 complex span tasks to assess working memory capacity
Students were given palm pilots for 7 days which, 8 times a day, assessed mind wandering as a function of demands of current task
Rated their concentration and effort on the task they were doing, as well as its challenge
Showed that higher working memory capacity associated with more on-task thoughts/less mind wandering only when task more challenging and required more concentration and effort
What is the difference between people with high working memory capacity compared to those with low working memory capacity
High-capacity individuals are better at controlling external and internal distracting information and better at discriminating between more and less relevant information and inhibiting the processing of irrelevant information
Does performance in many working memory tasks reflect only pure working memory capacity?
Performance in many working memory tasks reflects both pure working memory capacity/executive processes and the effects of strategies such as rehearsal and chunking
What is the estimate for working memory capacity when learning strategies are eliminated?
• When strategies are eliminated, estimates of working memory capacity are less than 7±2
o Pure working memory capacity and attention capacity can only hold 3-4 chunks across many tasks and modalities
Why are pure measures of working memory capacity/executive processing important?
Pure measures of working memory capacity/executive processing predicts developmental changes and individual differences in fluid intelligence
Describe Shipstead et al’s 2016 model of solving complex problem solving tasks
• Shipstead et al. (2016) - manner of solving complex problem-solving tasks
o There is a top-down executive signal of attention/resources which is dependent on common executive functions (shifting, updating, inhibition)
o This influences maintenance of attention on relevant information (updating) or disengagement of irrelevant information (shifting), which is partially determined by the nature of the to-be-performed task
o Task provides an environmental medium around which cognitive processes are organised.
What is the difference between working memory and fluid intelligence?
• Shipstead et al 2016 argue that working memory and fluid intelligence are highly related, but differ in the relative importance of maintenance vs disengagement
o Working memory/complex span tasks dependent more on maintenance processes while problem solving tasks depend more on disengagement
Describe early memory research and traditional tests of memory
• Early memory research dominated by laboratory studies of explicit, declarative (episodic) memory in free recall and recognition tasks
• Traditional tests of memory
1. Study list presented with words and images
2. Delay and filler task
3. Test of memory
Recall task- write down as many of the studied items as you can remember
• Free recall- producing list items in any order in the absence of any specific cues
• Serial recall- involves producing list items in the order they were presented
• Cued recall- involves producing list items in response to cues
Recognition task- present studied content mixed with equal number of matched new content: participants decide which ones they saw before
What factors affect encoding?
• Find that depth of processing affects encoding due to
o Level of processing effects
But this theory has limitations in that it underestimates the importance of the retrieval environment in determining memory performance
o Better memory when material is richer and longer (more adjectives) and elaborated at encoding
o Better memory for material that is self-relevant
What factors affect retrieval?
• Retrieval task and context
o Recognition is nearly always better than free recall as recall task does not provide any explicit cue
Recognition task provides a cue (the studied item) which can prime memory
Memory network can also be primed by the study context
• Godden and Baddeley (1975)
o Participants learned words either on land or 20 feet underwater and were asked to later recall the words on land or underwater
o Recall best when the contexts match but recognition unaffected by context
o Different retrieval tasks appear to probe different aspects of memory
What is the encoding specificity retrieval principle
o Encoding specificity principle-The probability of successful retrieval of the target item rises with rising amounts of information overlap between the information present at retrieval and the information stored in memory (Tulving)
Effectiveness of a cue depends on information overlap and the unique relationship between the cue and a piece of information
What is transfer appropriate processing
o Transfer appropriate processing (Morris, Bransford and Franks 1977)
Made participants learn words in semantic or rhyming context, then tested them with semantic and rhyming test
Memory performance usually best for deeply encoded items but retrieval is best when there is a match between the processes required at encoding and retrieval
When encoding focuses on meaning, transfer depends on the meaning of the sentence, not its literal form
Memory is dependent on the requirements of the memory test
Describe explicit memory tasks vs implicit memory tasks in what tasks are used, what it is and what it is influenced by
Explicit memory tasks
- Recall/recognition
- Subjects explicitly told to remember items from the previous list → intentional retrieval
- Memory performance shows level of processing and transfer-appropriate processing effects
Implicit memory tasks
- Fragment completion, stem completion, perceptual identification
- Not told to remember anything, just to perform a task: a cover task is used to disguise the fact that it is a memory task
- Compare performance for old and new items to infer memory
- Recently activated information more likely to be automatically retrieved in response to fragment cue
- Independent of deliberate, intentional retrieval
- Does not strongly show level of processing effects
Describe what parts of the brain declarative memory is controlled by
• Declarative memory/explicit memory
o Episodic memory- one’s own experiences
Hippocampus and neocortex dependent
o Semantic memory-Facts, general knowledge
Underlying entorhinal, perirhinal and parahippocampal cortices
o Both these memories are controlled by the medial temporal lobe and diencephalon
How do semantic memory and episodic memory interact
o Semantic memory can enhance learning on an episodic memory task whilst episodic memory can facilitate retrieval on a semantic memory task
o Semanticisation of episodic memory- when an episodic memory becomes a semantic memory
o Most episodic memories exhibit substantial forgetting over time except permastore memories
Describe the parts of the brain involved in procedural memory
• Procedural memory/implicit memory
o Classical conditioning effects
Controlled by cerebellum areas
o Procedural memory-motor skills, habits, tacit rules
Controlled by the basal ganglia
o Priming-implicit activation of concepts in long-term memory
Controlled by the neocortex
What is the ACT model?
• ACT model: declarative/explicit memory and procedural/implicit memory are separate structures
o Distinction between implicit and explicit memory tasks have been taken as evidence that these two types of tasks tap different memory systems
o These memory systems might reside in different parts of the brain
What is dissociation evidence?
Dissociations- a variable affects performance on one memory task but has no effect, or a different effect, on another task
What is single dissociation evidence that implicit and explicit memory systems are different memory systems that rely on different parts of the brain
Amnesia evidence: amnesia affects explicit but not implicit memory performance
Depth of encoding affects explicit but not implicit memory tasks
What is anterograde amnesia and what memory system do patients with it have trouble with?
Anterograde amnesia- cannot form new declarative memory
• Have more trouble with episodic memory than semantic, but both are severely impaired
What is retrograde amnesia, and what memory system do patients with it have trouble with?
Retrograde amnesia- amnesia for previously stored memories
• Have more trouble remembering episodic than semantic memories, but both are impaired
Describe Henry Molaison's condition, what his capabilities were and what that meant
• Henry Molaison (HM): bilateral removal of the hippocampal region (to treat his severe epilepsy) caused severe memory loss
o Profound anterograde amnesia from immediately after surgery
o Retrograde amnesia, particularly for last 2 years before surgery
o Intact short-term memory (Suggesting that short term and long-term memory have an important distinction), language, knowledge of past world events (suggesting that memories are not permanently stored in the hippocampus), skill learning (suggesting that there is more than one long term memory system)
What was the initial interpretation of the hippocampus’ role in memory, and how was it proven wrong?
• Initial interpretation was that the hippocampus is not the site of either short term memory or long term memory, but required to convert experience into durable form
o But amnesic patients can acquire new procedural skills and show evidence of implicit memory, so this was an inaccurate statement
E.g. HM could learn and participate in mirror drawing
With repeated exposure to the task and practice, HM’s performance substantially improved
o Amnesic patients impaired in tests of explicit but not implicit memory
Warrington and Weiskrantz (1970)
• A group of people with amnesia was compared with a group of control medical patients who were hospitalised in a similarly disruptive context but without amnesia
• In classic tests of explicit memory, people with amnesia performed substantially more poorly than control patients
• In implicit memory tasks (word fragment tests and stem completion tests), people with amnesia performed as well as control patients
What is a mirror drawing task?
• Mirror drawing task- Trace around shape without seeing hand directly as it is blocked by a board- instead must rely on mirror image
What is the current interpretation of explicit and implicit memory systems in amnesia
• The interpretation now is that memory is not unitary: the declarative memory deficit in amnesia does not affect short-term memory, general knowledge or procedural memory
o Amnesiacs show impaired explicit memory but preserved implicit memory
• Amnesiac experiments (and HM’s results) taken as evidence that explicit memory and implicit memory depend upon different brain systems
Describe Ramponi et al’s 2010 experiment on implicit vs explicit memory
• Ramponi et al (2010)
o Experiment
Manipulated whether material that people tried to learn were pairs of words that would have emotional significance or pairs of words that were neutral
Manipulated depth of encoding by getting people to either do phonemic task (which word had the most syllables) or a semantic task (which word was more pleasant)
Manipulated type of memory test: explicit cued recall task (had to give second word of word pair when given the first word) with intervening distractor tasks vs implicit task where needed to say first word participant could think of when given first word of word pair
o Results
Emotion and depth of processing affected explicit (emotional and semantic encoding yielded better memory) but not implicit memory task
Does depth of encoding affect both explicit and implicit memory tasks?
Depth of encoding affects explicit but not implicit memory tasks
Does a match between modality at encoding and retrieval affect both explicit and implicit memory tasks?
A match between modality at encoding and retrieval affects implicit but not explicit memory tasks
Describe Weldon and Roediger’s 1987 experiment on implicit and explicit memory
• Weldon and Roediger 1987
o Got people to study pictures or words and then tested them in free recall task (explicit memory) or in fragment completion task (implicit memory)
o For free recall, there was not much difference between pictures and words, but for the fragment completion task, studied words led to better performance than studied pictures
What is the limitation of single dissociation evidence?
However, single dissociation evidence could be due to task-specific factors
• Major confounding factor: maybe one task is more difficult than the other for certain groups of people
Is single or double dissociation evidence stronger
Double dissociation evidence is stronger
What is double dissociation evidence?
Where a variable has opposite effects on two different types of memory tasks
What is double dissociation evidence that explicit and implicit memory systems are separate memory systems?
Depth of processing at encoding (Jacoby, 1983)
Describe Jacoby’s 1983 experiment on implicit vs explicit memory systems
o People studied words and were either given the word by itself with no context, given the word along with a strong associate (to encourage semantic processing), or given a generate task where they had to generate the word from an associated word and only its first letter (to encourage even deeper semantic processing)
o In a recognition memory task (explicit memory task), memory was best in the generate condition, followed by the strong associate condition, followed by the no-context condition.
o However, it was the opposite for the perceptual identification task (implicit memory task) involving degraded items
What type of encoding is better for explicit memory according to Jacoby 1983
• Explicit memory better following deeper encoding
What type of encoding is better for implicit memory according to Jacoby 1983
• Implicit memory better after shallow processing
What are the limitation of looking at implicit and explicit memory tasks?
• Implicit and explicit memory tasks
o Different memory tasks require different processes: no memory task is process pure (nearly every task involves more than one type of process) and one type of memory may be used to complement the task of another
o Answers may be misleading: a "yes" answer in a recognition task may be based on recollection, familiarity or guessing
What are limitations of double dissociation evidence for implicit vs explicit memory?
- Implicit and explicit memory tasks
* Transfer-appropriate processing
Why is transfer-appropriate processing a limitation for double dissociation evidence for implicit vs explicit memory
o Memory depends on match between encoding and retrieval processes: perceptually-driven vs conceptually driven
Perceptually driven tasks encourage encoding based on activation and retrieval based on familiarity, whilst conceptually-driven tasks encourage encoding based on elaboration and retrieval based on recollection
Explicit memory tasks encourage deliberate intentional retrieval, which is best when deep semantic, elaborative processing at encoding
Implicit memory tasks encourage automatic unconscious retrieval, which is best when there is a perceptual match between study and test conditions
o Implicit and explicit memory tasks involve different retrieval processes and therefore benefit from different encoding processes
o Explicit memory tasks require conceptually driven retrieval and therefore benefit from deep, conceptually driven encoding
o Implicit memory tasks depend on perceptual familiarity of the stimulus- benefit from perceptual match between encoding and retrieval
Describe Roediger’s 2008 experiment on implicit and explicit memory
Roediger 2008
• Encoding tasks at different levels of processing:
o Letter number
o Syllable number
o Living identification
o Intentional memory
o Self relevance
• Then paired with a range of different memory tasks that tested implicit and explicit memory:
o Explicit: recognition, recall, semantic-cued recall, graphemic-cued recall
o Implicit: general knowledge quiz, word fragment completion
• Demonstrated graded effects of the overlap between encoding and retrieval processes
What is the processing account of explicit and implicit memory?
• The processing account-memory processes view
o Explicit and implicit memory are not different memory systems
o Explicit and implicit memory tasks involve different memory retrieval processes: recollection vs familiarity
What is recollection?
Recollection- process of recognising an item on the basis of retrieval of specific contextual details
What is familiarity?
Familiarity- process of recognising an item on the basis of its perceived memory strength but without retrieval of any specific details about the study episode
How can we distinguish between familiarity and recollection?
Remember/know procedure
• Ask participants to report on their subjective memory state for words they recognised
o Avoids general answer “yes”, which can be based on recollection, familiarity, or guessing
Process dissociation method
• Study task at different levels of processing
• Memory test:
o Inclusion task-measures recollection and familiarity
Complete the inclusion task (such as a stem completion task) with study-list content; if you can't, complete it with other content
o Exclusion task- measures familiarity only: participants must complete the task only with content NOT seen in the study list, so any studied items they nevertheless produce must have come to mind automatically (familiarity) without recollection
Complete the exclusion task with content not seen in the study list; otherwise leave it blank
Describe Konstantinou and Gardiner’s (2005) experiment
• Konstantinou and Gardiner (2005)
o Memory for faces encoded in shallow (how round the face was) or deep conditions (what the person was famous for)
o Recollection responses sensitive to level of processing, attention at encoding but familiarity responses are not
Describe Wheeler and Buckner’s 2004 neuroimaging results of recollection vs familiarity
o Neuroimaging found two areas that were more active for old items than for new items
o Strongest activation in posterior parietal areas common to both recollection and familiarity responses
o Subset of temporal cortex specific to recollection
o Stronger activation in both posterior parietal and specific frontal areas for recollection than familiarity
o No areas that specifically respond to familiarity
-No evidence for separate MEMORY systems involved in recollection and familiarity
Describe Jacoby and Kelley’s 1992 experiment using inclusion and exclusion tests on attention
o Compared performance in stem completion inclusion and exclusion tasks for stimuli studied in full vs divided attention (had to respond whenever they heard a particular frequency tone) conditions at encoding
o Attention at encoding affected inclusion more than exclusion task
o Found that divided attention at encoding impairs recollection but not familiarity
How can exclusion and inclusion measures be used to mathematically estimate recollection and familiarity?
o Used inclusion and exclusion measures to mathematically estimate recollection (R) and automatic familiarity (A) from inclusion and exclusion performance (worked example below)
Inclusion = R + A(1 − R)
Exclusion = A(1 − R)
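A brief worked sketch of how R and A are estimated from the two equations above; the numerical values in the comment are hypothetical, not from the lecture:

```latex
\begin{align*}
I &= R + A(1-R) && \text{(inclusion: produced via recollection or automatic familiarity)}\\
E &= A(1-R) && \text{(exclusion: produced via familiarity alone, despite instructions)}\\
\Rightarrow\quad R &= I - E\\
A &= \frac{E}{1-R}
\end{align*}
% Hypothetical illustration: if I = 0.60 and E = 0.20,
% then R = 0.60 - 0.20 = 0.40 and A = 0.20 / (1 - 0.40) ≈ 0.33.
```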
Does divided attention impair both recollection and familiarity?
o Found that divided attention at encoding impairs recollection but not familiarity
What is consolidation and what brain part plays a vital role in it?
• Consolidation- a physiological process involved in establishing long-term memories
o Hippocampus plays vital role in consolidation of memories, but over time, memories are stored in the neocortex
What is evidence of consolidation?
o Recently formed memories still being consolidated are especially vulnerable to interference and forgetting
o Evidence:
Forgetting curve
Patients with retrograde amnesia with hippocampal damage cannot remember recent memories before their amnesia
When interfering material is dissimilar to that in the first learning task, there is often more retroactive interference when it is presented early in the retention interval
What is reconsolidation?
o Reconsolidation
Reconsolidation: when a stored memory is retrieved it becomes temporarily labile and must be consolidated again. This can be used to update our knowledge but can cause us to misremember the original information
Describe Yonelinas 2002 meta-analysis
• Yonelinas (2002) meta-analysis
o Tested the change in recollection and the change in familiarity under different comparison conditions:
Deep vs shallow processing
Generation vs read processing
Full vs divided attention
Short vs long duration
Placebo vs benzodiazepine
o In all conditions except short vs long duration, recollection was more affected than familiarity
Do manipulations that encourage deeper processing affect recollection and familiarity? What does this show?
o Memory performance reflects both recollection and familiarity but all manipulations that encourage deeper processing (except stimulus duration) increase recollection more than familiarity
Shows that effects are due to what the participant is doing with the material rather than the time they have to encode the material (encoding vs time)
Deeper encoding increases recollection of previous experience more than familiarity
Automatic familiarity effect is insensitive to depth of encoding
Dissociation evidence, but due to different processes rather than different memory systems
Describe Skinner and Fernandes 2007 experiment and results
• Skinner and Fernandes 2007
o Summarised data from 12 neuroimaging studies that estimated recollection, familiarity and level of confidence
o Found that recollection was the strongest predictor of activation in the medial temporal lobe/hippocampus (left hippocampus most of all)
o Summarised data from studies that estimated recollection and familiarity in healthy samples and brain-injured patients (21 studies of patients with brain lesions)
Found that people with hippocampal damage were not able to do much recollection in contrast to familiarity
Does all evidence favour the processing view of recollection vs familiarity?
• Therefore, evidence from healthy adults favours processing view, but neural mechanisms reflect memory systems view
o However, different neural mechanisms could instead be interpreted as different processes
Why could there be a contradiction between behavioural data and neuropsychological data in the debate of memory structures vs processes in recollection and familiarity?
o Neuropsychological data of brain injury often appears consistent with the systems approach
But brain injury might disrupt processes involved in co-ordinating other processes - not clear-cut
Neuroimaging evidence
• Makes a useful contribution when clear cognitive methods are used to design experiments and interpret patterns of activation (recollection vs familiarity)
o Need to have a theory of how knowledge is represented to validly interpret neuroimaging data
o Healthy subjects’ data tend to support the processing view
Describe the processing components approach (Cabeza 2011-2013)
o Processing components approach: investigate role of specific brain regions as a function of task goals, demands (Cabeza 2011-2013)
Argue that particular brain regions might be involved according to participant’s goal and level of motivation, as well as the demands of the task
Assumed that there are numerous processing components and that these components can be combined and re-combined for specific learning purposes
• Cognitive process: perceptually or conceptually driven
• Stimulus representation: item or relational
• Level of intention: controlled vs automatic
What are two early models of semantic memory?
o Hierarchical network model (Collins and Quillian 1969)
o Associational network (Collins and Loftus 1975)
What is the hierarchical network model, its support and its limitations?
o Hierarchical network model (Collins and Quillian 1969)
Argued that information in semantic memory is organised taxonomically (in a hierarchy)-from general nodes to specific nodes
Associated with each level of the taxonomic hierarchy are particular properties; properties from higher levels are inherited by lower levels (they do not need to be stored again at lower levels)
Evidence supporting hierarchical network theory: sentence verification task
• How long it took people to decide whether a sentence is true or not
• Time to make semantic decisions is a function of distance in the hierarchy (e.g. "a canary is a bird" is verified faster than "a canary is an animal"; see the sketch below)
Cognitive economy (efficiency) is the guiding principle
Limitations of hierarchical network theory:
• Time to make decisions is also influenced by typicality and ambiguity
• Knowledge is unlikely to be learned as a hierarchy: it is unclear how information would come to be organised hierarchically through learning
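A minimal sketch (with illustrative node and property names, not taken from the lecture) of how a hierarchical network with cognitive economy could be represented, and how sentence-verification time would grow with the number of ISA links traversed:

```python
# Minimal sketch of a hierarchical semantic network with cognitive economy.
# Node names and properties are hypothetical illustrations.

network = {
    "animal": {"isa": None,     "properties": {"breathes", "eats"}},
    "bird":   {"isa": "animal", "properties": {"has wings", "can fly"}},
    "canary": {"isa": "bird",   "properties": {"is yellow", "can sing"}},
}

def has_property(concept, prop):
    """Return (True/False, levels traversed). A property stored once at a
    higher level (cognitive economy) is inherited by all lower nodes."""
    steps = 0
    node = concept
    while node is not None:
        if prop in network[node]["properties"]:
            return True, steps
        node = network[node]["isa"]
        steps += 1
    return False, steps

# Sentence verification: more levels traversed -> slower decision.
print(has_property("canary", "can sing"))   # (True, 0)  fast
print(has_property("canary", "can fly"))    # (True, 1)  slower
print(has_property("canary", "breathes"))   # (True, 2)  slowest
```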
What is the associational network and what is its limitation?
o Associational network (Collins and Loftus 1975)
Concepts were linked within a network so that the distance between concepts and strengths of the links between concepts was a function of their similarity, shared features, how likely they were to occur together
Faster to make decisions about things that are closer together in the network (see the sketch below)
However, still no information as to how that network came to be/its representation in the brain
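A minimal sketch of the associational-network idea, assuming hypothetical concepts and link strengths: decision time is modelled simply as decreasing with the strength of the link between two concepts.

```python
# Minimal sketch of an associational network: concepts are linked with strengths
# reflecting similarity / co-occurrence, and decision time falls as strength rises.
# Concept names, strengths and timing constants are illustrative only.

links = {
    ("doctor", "nurse"):  0.9,   # strongly associated
    ("doctor", "bread"):  0.1,   # weakly associated
    ("bread",  "butter"): 0.8,
}

def decision_time(a, b, base=300):
    """Hypothetical verification time in ms: stronger links -> faster decisions."""
    strength = links.get((a, b)) or links.get((b, a)) or 0.05
    return base + (1 - strength) * 700

print(decision_time("doctor", "nurse"))   # fast (close in the network)
print(decision_time("doctor", "bread"))   # slow (distant in the network)
```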
What semantic model is currently accepted and who is it by?
• Parallel Distributed Processing (PDP) models (McClelland and Rogers 2003)
Describe the Parallel Distributed Processing model (McClelland and Rogers 2003)
o Computational models: computer programs inspired by neural metaphor
o Connectionist (neural) networks
Set of interconnected processing nodes (neurons) that communicate by sending activation or inhibition
A learning rule for adjusting connections throughout the network
• Structure of knowledge within the network is learned by adjusting the strength and weightings of the connections between the various network nodes
Learning in connectionist networks:
• Learning algorithms (delta rule, back propagation)
• Supervised learning (adjust connection weights to reduce error)
o Error correction learning (forward and back propagation of activation)
o Structure of knowledge stored in all these connections reflects the conceptual organisation of the concepts that the network has been trained with
Memory structure is an emergent property of the distributed dynamic processing assumptions of PDP models
Describe how forward and back propagation of activated works in the PDP model
o Forward and back propagation of activation-
Receives input (that has a particular strength and is assigned a particular weight) from the environment or from other nodes in the network
• Initial weights of the network are all random
• After many trials, more differentiation in the patterns for different concepts
That strength and weight are combined by formula to get an overall level of activation of node as a function of input coming into it, which then transmits that activation to the next layer of nodes
Eventually, the activation will reach the output layer (response)
The pattern of activation that has been generated is compared to the pattern that should have occurred; during the learning stage, the discrepancy between the generated pattern and the correct output is used to adjust the weights in the network to reduce error, so that the next time the stimulus is presented it will yield a more correct pattern (see the sketch below)
Back propagation of error to increase accuracy of networks performance
Supervised learning rule
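A minimal sketch of error-correction learning in a tiny connectionist network. This uses a single layer trained with the delta rule; the PDP models described above use multi-layer networks with back-propagation, and the concepts and feature patterns below are hypothetical.

```python
import numpy as np

# Single-layer delta-rule learning: gradually adjust connection weights so that
# each input (concept) pattern produces its correct output (feature) pattern.

rng = np.random.default_rng(0)

# Hypothetical training patterns: input = concept, target = its features.
inputs  = np.array([[1, 0],      # "canary"
                    [0, 1]])     # "salmon"
targets = np.array([[1, 1, 0],   # canary: can fly, can sing, cannot swim
                    [0, 0, 1]])  # salmon: cannot fly/sing, can swim

weights = rng.normal(scale=0.1, size=(2, 3))     # initial weights are random
learning_rate = 0.1

for epoch in range(200):                         # learning is gradual: many small updates
    output = inputs @ weights                    # forward propagation of activation
    error = targets - output                     # discrepancy from the correct pattern
    weights += learning_rate * inputs.T @ error  # adjust weights to reduce error

print(np.round(inputs @ weights, 2))             # approximates the target feature patterns
```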
What are fundamental assumptions of the parallel distributed processing model?
o Fundamental assumptions-
Knowledge is distributed through the system rather than localised in a single symbolic node
• Knowledge is patterns of activity between connections
• Knowledge learned by gradually adjusting connection weights until the network can reproduce the correct output for all trained inputs-> learning must be gradual
o Must only make small changes of connection weight at a time or it will disrupt all previous learning that has occurred because different nodes in the network are involved in the representation of a wide variety of concepts
• No clear distinction between structure and processes
o Learning processes give rise to structure of knowledge which are then used for retrieval processes
What are advantages of the parallel distributed processing model?
o Advantages of PDP model
Spontaneous generalisation: category structure emerges from overlap between instances and allows generalisation of knowledge
• However, difficult for system to learn about atypical examples in category
Graceful degradation
• Loses knowledge in gradual way- stronger effect on concepts that were more difficult to learn
• Semantic dementia-
o People show gradual loss of conceptual knowledge
o Anterior temporal lobe: Site of damage in semantic dementia
Does the Parallel Distributed Processing model differentiate between general and specific information? How?
o Memory for general vs specific information
No clear distinction between general and specific information
Store specific instances but other similar instances will use some of the same connections
Average activity across many separate instances will correspond to general abstract semantic knowledge of the concept
No clear distinction between types of memory representations for semantic and episodic knowledge
Why is there a need for complementary learning systems to be included in the parallel distributed processing model
o Difficulties in learning resulting from PDP model
Cannot easily learn exceptions
Poor at one-trial learning-need gradual repeated exposure to concept
Difficulty with AB-AC (retroactive interference) learning
• Learning new association will interfere with previous knowledge
These problems all arise from the need for very gradual learning of semantic memory
• This is why consolidation of long-term memories take a long time
Describe how complementary learning systems work
Distributed representations must be learned gradually, incrementally with interleaved training so that there is no catastrophic interference with previous learning (see the sketch after this list)
A different form of memory is required to account for initial acquisition of new episodes and explain how new episodes are consolidated into distributed long-term memory system in neocortex
• This might be the role of the hippocampus and medial temporal lobe
One way in which new information is gradually integrated from the hippocampus into the neocortex is through sleep and dreaming
Hippocampus critical in binding together new and activated information and playing back information to the neocortex to integrate and form new episodic memories
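A rough sketch of the AB-AC problem and the interleaving solution (my own toy example in Python, using a simple error-driven rule rather than a full PDP network): learning a new response to the same cue on its own overwrites the old association, whereas interleaved training preserves both.

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def make_net():
        return [0.0, 0.0, 0.0, 0.0]   # weights: cue unit 1, cue unit 2, list-1 context, list-2 context

    def train(weights, examples, epochs, lr=0.5):
        for _ in range(epochs):
            for inputs, target in examples:
                out = sigmoid(sum(w * i for w, i in zip(weights, inputs)))
                err = target - out
                for j, i in enumerate(inputs):
                    weights[j] += lr * err * i

    def recall(weights, inputs):
        return sigmoid(sum(w * i for w, i in zip(weights, inputs)))

    AB = ([1, 1, 1, 0], 1.0)   # cue A in the list-1 context -> respond "B" (coded as 1)
    AC = ([1, 1, 0, 1], 0.0)   # the same cue A in the list-2 context -> respond "C" (coded as 0)

    # Sequential training: learn AB fully, then train only on AC.
    seq = make_net()
    train(seq, [AB], 300)
    train(seq, [AC], 300)

    # Interleaved training: AB and AC practised together in small steps.
    inter = make_net()
    train(inter, [AB, AC], 300)

    print("AB recall after sequential AC learning:", round(recall(seq, AB[0]), 2))    # drops well below 1
    print("AB recall after interleaved learning:  ", round(recall(inter, AB[0]), 2))  # stays close to 1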
How does consolidation of long term memory occur?
o Hence, consolidation of long-term memories takes a long time
Initially, memory is in the pre-frontal cortex and rehearsal is used to hold it there
Over the first few hours after an experience there is gradual cellular consolidation in the hippocampus, but the long-term potentiation mechanisms in the cortex take hours or days
• Retrograde amnesia explanation: more recent memories have not been fully consolidated
Describe the role of binding in both explicit and implicit memory, and how that binding occurs
• Binding and implicit vs explicit memory (Reder et al.2009)
o Implicit and explicit memory tasks rely on the same knowledge representations of concepts
o Implicit priming= activation of abstract, consolidated concept node in neocortex
o Explicit recollection requires binding between concepts and context
o Role of hippocampus/medial temporal lobe is to bind together information about the activated concept and other aspects of the experience (episode node and general context node)
Amnesia is associated with deficits in the binding processes important for creating new memories and in recollection processes
Similar deficits in healthy people given midazolam: temporary amnesia
Describe Henke’s (2010) view and the 3 major processes critical to long term memory activation and retrieval
• Rather than assuming there are separate systems for declarative and non-declarative memory, cognitive neuroscientists are now exploring the processes associated with different regions
o Henke (2010)- memory systems are identified on the basis of the types of processing involved
3 major processes critical to long term memory activation and retrieval
• Binding
o Hippocampus and neocortex- play critical role in rapid encoding of flexible associations (binding together information and context to make new episodic memory)
• Consolidation
o Basal ganglia, cerebellum and neocortex- play a critical role in slow encoding of rigid stable associations
o Required for procedural memory, classical conditioning and semantic memory
• Priming
o Parahippocampal gyrus and neocortex-play a critical role in rapid encoding of single or unitized items
o Required for familiarity and priming
What are the implications of the parallel distributed processing and complementary learning systems framework?
- Distributed memory representations are constructed from individual episodes but capture abstract relationships between them
- Consolidation of memories is a gradual, incremental process
What are the two views of mental representations?
- Symbolic
- Concrete
What are symbolic models of semantic memory and their rules?
• Symbolic models of semantic memory assume abstract representations
o Assumed that they are abstract in nature and are thus detached from input and output processes
o They are stable in that any given individual uses the same representation of a concept on different occasions
o Different people generally have fairly similar representations of any given concept
Who argued that memories are not abstract? Outline his argument
• Barsalou (2009)- memories are not abstract
o Representation of any given concept will vary across situations depending on the individual’s current goals and the situation’s major features
Processing is influenced by the current context or setting
o Mental representations of concepts include sensory/motor attributes involved in perception and action
o Representations distributed across sensory and motor neural systems (not abstract)
o Distributed representation of an object is composed of activation in all of the relevant sensory areas that were involved in interacting and learning about that object
What is Hauk et al’s evidence that memories are not abstract?
• Hauk et al (2004)
o Lexical decision responses to lick, pick, and kick (visually similar words) activated the motor areas for tongue, finger and foot respectively
o Evidence that memories are not abstract-supports Barsalou’s view
What is Pulvermuller et al’s evidence that memories are not abstract?
• Pulvermuller et al (2005)
o Transcranial magnetic stimulation (TMS) delivered to the arm or leg motor strip facilitated lexical decision responses to arm or leg words respectively
What are limitations of Barsalou’s approach on concrete concepts?
• Evidence that the motor system is associated with the processing of action words, but does not show whether motor processing is necessary for understanding action concepts or whether such processing occurs only after concept meaning has been accessed
• Limitations of Barsalou’s approach
o He exaggerates how much concept processing varies across situations: the traditional view that concepts possess a stable, abstract core has not been disproved
Both theoretical approaches are partially correct- concepts have a stable core and their structure is context-dependent
Describe the Hub and Spoke model (Patterson et al. 2007)
• Hub and Spoke model (Patterson et al. 2007): links the abstract and concrete views and is the most comprehensive approach
o The hub consists of a modality-independent, unified conceptual representation that efficiently integrates our knowledge of any given concept
Assumed that hubs are located within anterior temporal lobes
Hubs provide conceptual coherence and integration of sensory and motor processing
Semantic dementia- patients lose the hub
o Spokes consist of several modality-specific regions involving sensory and motor processing
Spokes facilitate our interactions (perceptual and motor) with the world around us
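A very rough sketch of the hub-and-spoke idea as a data structure (purely illustrative, not Patterson et al.'s implementation; all feature names are invented): modality-specific spokes each hold part of a concept, and the hub binds them into one modality-independent representation, which is lost when the hub is damaged, as in semantic dementia.

    # Modality-specific "spokes", each holding part of what we know about a concept.
    spokes = {
        "visual":   {"dog": {"four_legs", "fur"},   "seagull": {"wings", "feathers"}},
        "auditory": {"dog": {"barks"},              "seagull": {"squawks"}},
        "motor":    {"dog": {"pat", "throw_ball"},  "seagull": {"shoo_away"}},
    }

    def hub(concept, intact=True):
        # The hub integrates all spoke information into one modality-independent representation.
        if not intact:      # crude stand-in for hub damage in semantic dementia:
            return set()    # the spokes can no longer be bound into a coherent concept
        return set().union(*(modality.get(concept, set()) for modality in spokes.values()))

    print(hub("dog"))                  # integrated, modality-independent concept
    print(hub("dog", intact=False))    # hub damaged: no coherent concept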
What was the early view and definition of episodic memory
• Early view of episodic memory-
o Episodic memory was originally defined by its content (Tulving, 1972)
Temporally dated and related to the self
o But early evidence did not really assess episodic memory
Tested what, but not where or when
Failed to distinguish recollection from familiarity
• Shown to be important distinguishing point of episodic memories
What is the current view of episodic memory? (its defining features, formation and definition)
• Revised view (Tulving 2002)
o Episodic memory is a subsystem of semantic memory but is:
Recently evolved, late to develop in childhood and early to deteriorate (declines with age)
Probably unique to humans
Requires episodic retrieval mode (conscious recollection)
o Defining and distinguishing features:
Mental time travel
Autonoetic awareness
• Awareness of fact that you are remembering yourself being there
Linked to self: autobiographical
o Episodic memories require the construction of new memories that link/bind together concepts, contexts and self
The hippocampus plays a critical role in this binding process but semantic information can still be acquired through incremental, repetitive exposure
This process of binding to form new knowledge is independent of the distributed semantic memory network gradually acquired through experience: episodic memory is preserved in semantic dementia
o Binding is critical for consolidation, recollection and recall
What is neuropsychological evidence that episodic memory is a subsystem of semantic memory?
Semantic (fronto-temporal) dementia
• Hierarchical loss of knowledge about concepts
o General properties preserved much better as disease progresses
o Knowledge about more prototypical items preserved longer than knowledge about atypical items
o Attributes typically shared by category members are attributed to items that are exceptions
o Suggests semantic system deteriorates from the more precise to the more general, consistent with predictions of PDP models
• Other cognitive capabilities well preserved if not dependent on semantic knowledge
• Episodic/autobiographical knowledge relatively preserved despite loss of semantic information
Comparison of Alzheimer's disease and semantic dementia
• Areas of reduced metabolism are much more widespread in Alzheimer's disease
• Abnormal brain function in semantic dementia localised to anterior temporal lobe
• Performance in semantic tasks more impaired in semantic dementia than Alzheimer’s disease
o E.g. category fluency and picture naming tasks (Patterson et al. 2007)
• Alzheimer's disease is associated with greater impairments in episodic memory
• Suggests a dissociation between performance of semantic and episodic tasks
o But dissociations between episodic and semantic memory are NOT evidence that they are stored in separate structures
Describe what is known of autobiographical memory through Gilboa 2004
• Gilboa (2004)
o Few studies of real autobiographical memory
o Recruit different frontal regions than laboratory tests of episodic although there are overlapping regions recruited during both
But not necessarily evidence that they involve different memory content or different memory retrieval processes, it’s more due to type of monitoring and decision processes that are carried out about these autobiographical memories
Due to differences in monitoring and decision processes, not memory storage or retrieval
What are current, most recent conclusions of long term memory? Includes:
- Where long term memories are
- The role of the medial temporal lobe/hippocampus
- Experience of episodic memory with working memory
• Long term memories are distributed over the same brain regions involved in perception and comprehension
o There is no special neural system that stores memory or stores different types of memory
o To understand memory, we need to understand how knowledge is acquired and organised, and the processes that make use of it
• The medial temporal lobe/hippocampus is critical for encoding new memories
o Evidence from amnesiac patients and other sources
o Binding of concepts with context to create episodic memories and recollect autobiographical memories
o Consolidation required for durable memories in neocortex: abstract hub representations are integrated in anterior temporal lobe
• The experience of episodic memory also involves other frontal brain areas that overlap with the areas involved in working memory
o Combining cognitive, computational, neuropsychological and neuroscience approaches has helped but more research is needed