Week 10: Semantic Memory Flashcards

1
Q

What is episodic memory?

A

Memory for experienced events, defined in terms of spatio-temporal relations

2
Q

What is semantic memory?

A

Memory for general knowledge, defined in terms of conceptual relations

3
Q

Who distinguished between episodic and semantic memory?

A

Endel Tulving

4
Q

How is retrieval different for episodic and semantic memory?

A

Episodic memory retrieval is experienced as remembering an event, whereas semantic memory retrieval is experienced as knowing

5
Q

Name two paradigms used to study episodic memory

A

List learning paradigms (recall and recognition) and autobiographical memory paradigms

6
Q

What are three tasks used to study semantic memory?

A

Lexical decision task, naming task and sentence verification

7
Q

What do the earliest spreading activation models describe?

A

Words are nodes in a network, with connections representing their relations; activation spreads through the network when a word is presented

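The spreading-activation idea in this card can be sketched as a small network. All words, link strengths, step counts, and the decay parameter below are invented for illustration; they are not values from any published model.

```python
# Toy semantic network: nodes are words, edges carry association strengths.
network = {
    "doctor": {"nurse": 0.8, "hospital": 0.6},
    "nurse": {"doctor": 0.8, "hospital": 0.5},
    "hospital": {"doctor": 0.6, "nurse": 0.5},
    "bread": {"butter": 0.9},
    "butter": {"bread": 0.9},
}

def spread(source, steps=2, decay=0.5):
    """Inject activation at `source` and propagate it for `steps` cycles."""
    activation = {word: 0.0 for word in network}
    activation[source] = 1.0
    for _ in range(steps):
        new = dict(activation)
        for word, act in activation.items():
            for neighbour, strength in network[word].items():
                # Each hop is scaled by the edge strength and a decay factor.
                new[neighbour] += act * strength * decay
        activation = new
    return activation

acts = spread("doctor")
# Related words ("nurse") end up more active than unrelated ones ("bread").
```

Presenting "doctor" activates "nurse" and "hospital" but leaves "bread" untouched, which is how such models account for semantic priming.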
8
Q

What is a key limitation of spreading activation models?

A

They provide a network structure but lack a sense of the dynamics of the network, such as how long activation lasts or decays

9
Q

How can priming effects be observed in both episodic and semantic memory?

A

By studying pairs of words (e.g. CUP-KING) and observing priming during recognition memory experiments or lexical decision tasks

10
Q

What is the main revision in the compound cue model compared to the SAM model?

A

Multiple cues can be held in short-term memory and used for recognition memory, such as using both the current word and the word from the previous trial

11
Q

How is retrieval in recognition memory explained in the context of SAM?

A

It involves a global similarity computation where context and item associations are multiplied together to determine overall similarity

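The global-similarity computation in this card can be sketched with toy numbers. The association strengths below are invented for illustration; SAM's actual parameters are fitted to data.

```python
# Each stored memory image is summarised by two numbers: its association to
# the probe item and its association to the study context (toy values).
images_old_probe = [(0.9, 0.8), (0.2, 0.8), (0.1, 0.8)]   # probe was studied
images_new_probe = [(0.1, 0.8), (0.1, 0.8), (0.1, 0.8)]   # probe is a lure

def familiarity(images):
    """Global similarity: multiply item and context associations, then sum."""
    return sum(item * context for item, context in images)

old_f = familiarity(images_old_probe)   # studied probe -> higher familiarity
new_f = familiarity(images_new_probe)   # lure -> lower familiarity
# An "old" response is given when familiarity exceeds a criterion.
```

Because item and context strengths are multiplied, a probe only feels familiar when it matches stored images that were also encoded in the right context.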
12
Q

According to distributional models, how is the meaning of a word learned?

A

Based on patterns of word usage and co-occurrence, typically defined within documents

13
Q

What is a major problem with learning word meaning through co-occurrence?

A

Synonyms rarely co-occur in natural language, so raw co-occurrence yields high similarity between associates but not between synonyms

14
Q

What technique does Latent Semantic Analysis (LSA) use to extract relationships between words?

A

Singular Value Decomposition (SVD) and dimension reduction

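The SVD-plus-dimension-reduction step can be sketched with NumPy on a tiny word-by-document count matrix. The words, counts, and number of retained dimensions below are invented for illustration; real LSA operates on large corpora.

```python
import numpy as np

words = ["doctor", "nurse", "bread", "butter"]
# Rows = words, columns = documents (toy counts).
counts = np.array([
    [2.0, 3.0, 0.0],   # doctor
    [1.0, 4.0, 0.0],   # nurse
    [0.0, 0.0, 3.0],   # bread
    [0.0, 1.0, 2.0],   # butter
])

# SVD factorises the matrix; keeping only the top-k singular values
# performs the dimension reduction.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]   # word coordinates in the reduced space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# In the reduced space, doctor is closer to nurse than to bread.
```

Dimension reduction is what lets LSA relate words that share documents indirectly, going beyond raw co-occurrence counts.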
15
Q

What is a key success of LSA when applied to the Test of English as a Foreign Language (TOEFL)?

A

LSA scored about 64% on the exam, comparable to second-language English speakers

16
Q

What are the two types of representations learned by the BEAGLE model?

A

Item vectors and order vectors

17
Q

How does BEAGLE handle word order to understand grammatical class?

A

It learns that verbs tend to be flanked by nouns, and adjectives tend to be followed by nouns

18
Q

How does word2vec differ from LSA in its training process?

A

Word2vec uses prediction during learning and prevents overlearning through negative sampling (weights are adjusted only when there is a prediction error)
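One skip-gram update with a negative sample can be sketched as below. The vocabulary, vector size, learning rate, and training pairs are all invented for illustration; real word2vec trains on large corpora with frequency-based negative sampling.

```python
import math
import random

random.seed(0)
dim = 8

def new_vec():
    return [random.uniform(-0.5, 0.5) for _ in range(dim)]

# Separate target and context vectors, as in skip-gram (toy vocabulary).
target_vecs = {w: new_vec() for w in ["dog", "cat", "kettle"]}
context_vecs = {w: new_vec() for w in ["dog", "cat", "kettle"]}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_pair(target, context, label, lr=0.1):
    """One prediction step: label 1 = observed neighbour, 0 = negative sample."""
    t, c = target_vecs[target], context_vecs[context]
    pred = sigmoid(dot(t, c))
    err = label - pred            # no prediction error -> (almost) no change
    for i in range(dim):
        t[i], c[i] = t[i] + lr * err * c[i], c[i] + lr * err * t[i]

for _ in range(200):
    train_pair("dog", "cat", 1)       # "dog" predicts its real neighbour "cat"
    train_pair("dog", "kettle", 0)    # "kettle" drawn as a negative sample
```

After training, the model rates "cat" as a likely neighbour of "dog" and "kettle" as an unlikely one; because the update is scaled by the error, well-predicted pairs stop moving, which is the overlearning protection the card describes.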

19
Q

How can semantic memory representations augment episodic memory models?

A

By helping to explain false memories and the effects of categorised word lists

20
Q

What is an example of how episodic memory models can produce semantic representations?

A

The instance theory of semantics shows that episodic memory models can be scaled up to natural language to construct semantic representations based on context

21
Q

What is a key difference between retrieval-based and non-retrieval-based models of semantics?

A

Retrieval-based models construct semantic representations as needed, whereas non-retrieval-based models like LSA, BEAGLE, and word2vec learn static representations

22
Q

What might semantic memory be according to recent research?

A

A collection of episodes rather than a separate system from episodic memory

23
Q

What is the semantic contiguity effect?

A

The finding that in free recall, participants tend to follow a just-recalled item with an item that is semantically similar (e.g. DOG would often be followed by CAT)

24
Q

What are some criticisms of LSA?

A

LSA lacks word order, lacks iterative learning, and confounds associative and semantic similarity