Chapter 7 Slide Deck Flashcards
The Atkinson and Shiffrin model of memory storage
Information flows through three stores: sensory memory, short-term memory, and long-term memory.
Why do people overestimate the duration of a lightning strike?
The individual sees an afterimage of the original sensory input.
- Duration of the afterimage: 0.2 to 0.3 seconds
- Duration of the physical stimulus: 0.05 seconds
Sensory memory: function, capacity, duration
Function: holds information long enough for it to be processed for basic physical characteristics.
Capacity: can hold many items at once (large capacity).
Duration: very brief retention of images (0.3 seconds for visual information, 2 seconds for auditory information).
Short-term memory
Stores information for 20-30 seconds; after that, the information (sounds, images, or words) is either committed to long-term memory or lost altogether.
Capacity of Short-Term Memory
Short-term memory is also limited in the number of items it can hold.
* On average, people can hold 7 items (or 7 chunks of information) in short-term memory.
* The range is 7 ± 2.
George Miller published a paper in 1956 with this title: "The magical number seven, plus or minus two: Some limits on our capacity for processing information."
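The 7 ± 2 limit applies to chunks, not raw characters, so grouping items into larger units effectively expands capacity. A minimal illustrative sketch (the phone number and the chunk function below are made up for this example, not from the slides):

```python
# Illustrative sketch: grouping a digit string into chunks reduces
# the number of items that must be held in short-term memory.

def chunk(digits: str, size: int) -> list[str]:
    """Group a digit string into fixed-size chunks."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

number = "8005551234"
print(list(number))      # 10 separate items: above the 7 +/- 2 limit
print(chunk(number, 3))  # ['800', '555', '123', '4']: 4 chunks, well within it
```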
Other names for sensory memory
iconic memory
afterimage
Another name for short-term memory
working memory
Free recall
Free recall: subjects are free to recall a list of items in any order
Serial recall
Serial recall: subjects are to recall the list of items in their original order of presentation
Major difference between the two recall patterns
In serial recall, subjects have good memory for the beginning of the list; performance is poorer toward the end of the list.
In free recall, subjects have good memory for both the beginning and the end of the list (this is called the serial position effect).
Primacy effect
people have a good memory for items at the beginning of a list (reflects long-term memory)
Recency effect
people also have a good memory for items at the end of a list (reflects working memory)
Long-Term Memory
Long-term memory is an unlimited-capacity store that can hold information over lengthy periods of time (i.e., it has unlimited capacity and duration).
One viewpoint assumes that information is never lost from LTM. If you cannot remember something, that thing is lost in memory, not lost from it. Failing to find something does not mean that the thing has vanished (or does not exist).
- This is the retrieval failure viewpoint.
Another viewpoint assumes that information in LTM may decay and hence is lost from it over time. This is the decay viewpoint.
Recall test
A recall test requires participants to reproduce information on their own without any cues.
E.g., subjects learn 25 nonsense syllables. A recall test (also called a free recall test) asks the subjects to recall as many of the syllables as they can remember.
Recognition test
A recognition test requires participants to select previously learned information from an array of options.
* E.g., subjects learn 25 nonsense syllables and then see a list of 100 nonsense syllables. The test asks the subjects to identify the syllables that they learned earlier (also called an old/new recognition test).
What evidence suggests that information is never lost from memory?
- Relearning of the forgotten information is faster than first-time learning. This is not possible if the information has been lost from memory.
Relearning test
A relearning test asks subjects to memorize information a second time to determine how much time or effort is saved by having learned it before.
- Retention is measured as a saving score:
E.g., it takes you 20 minutes to learn a list of words the first time. It takes you only 5 minutes to relearn it a week later. Saving = 15 minutes.
* Saving score = 15/20 = 75% (you have retained 75% of the information)
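A minimal sketch of that arithmetic (the function and variable names are illustrative, not from the slides):

```python
# Illustrative sketch of the saving-score calculation described above.

def saving_score(first_learning_min: float, relearning_min: float) -> float:
    """Percentage of the original learning time saved on relearning."""
    saving = first_learning_min - relearning_min
    return saving / first_learning_min * 100

print(saving_score(20, 5))  # 75.0 -> 75% of the information retained
```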
What type of encoding goes with shallow processing?
structural encoding
- emphasizes the physical structure of the stimulus
Intermediate processing goes with which encoding level?
phonemic encoding
- emphasizes what a word sounds like
What type of encoding goes with deep processing?
semantic encoding
- emphasizes the meaning of verbal input
"Is the word written in capital letters?" addresses which type of encoding?
structural
Levels-of-processing (also called depth-of-processing) theory proposes that deeper levels of processing result in longer-lasting memory codes.
Chunking study
A study by Chase & Simon (1973):
Participants: three chess players at different chess-playing levels (master, intermediate, and beginner).
All players were given 5 seconds to study the positions of chess pieces taken from 1) actual games; 2) random arrangements.
The players then reproduced the chess positions from memory.
Result: the master reproduced positions from actual games far more accurately than the weaker players, but all three performed about equally on random arrangements, because experts can chunk familiar configurations of pieces.