Visual Word Recognition 2 Flashcards

1
Q

How does word frequency affect recognition?

A

The more frequently a word is used, the faster word recognition occurs (Cattell, 1886).

2
Q

When was the chronoscope invented?

A

Early 1880s

3
Q

How long does it take to name a letter and a word? What does this suggest?

A

It takes the same amount of time to name a letter and a word.
This suggests that we process words as a whole.

4
Q

How do we estimate word frequency?

A

We use a corpus-based estimate
(counting how often each word occurs in a large body of text).
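
A minimal sketch of what a corpus-based frequency count could look like in Python; the file name corpus.txt and the crude tokenisation are assumptions for illustration, and published norms use far larger corpora.

    # Sketch: estimating word frequency from a plain-text corpus.
    import re
    from collections import Counter

    with open("corpus.txt", encoding="utf-8") as f:
        tokens = re.findall(r"[a-z']+", f.read().lower())   # crude tokenisation

    counts = Counter(tokens)
    total = len(tokens)

    for word in ("the", "table", "quixotic"):
        per_million = counts[word] / total * 1_000_000 if total else 0.0
        print(f"{word}: {counts[word]} occurrences, {per_million:.1f} per million words")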

5
Q

What did Brysbaert & New (2009) find in their study of a corpus built from subtitles?

A

Subtitles acted as a better frequency corpus than written sources.

6
Q

Does increasing the corpus size help improve estimates? What is the optimum?

A

Only up to a point. The optimum is around 30-40 million words (minimum 16 million).

7
Q

Why is Google Books not a reliable corpus?

A

Its texts are digitised with optical character recognition, which introduces a lot of noise and errors.

8
Q

How much language are people exposed to? (Brysbaert et al., 2016)

A

Both men and women speak about 16,000 words a day, so speaking plus listening comes to about 32,000 words a day.

1 year: 11.7 million words.
20 years: 234 million.
60 years: 1.64 billion.

9
Q

What did Adelman et al. (2006) argue improves word recognition instead of frequency?

A

Contextual diversity (the number of different documents or contexts in which a word appears) improves word recognition.

10
Q

What did Jones et al. (2012) argue improves word recognition instead of frequency?

A

Semantic diversity (the variability of the semantic contexts in which a word appears) improves word recognition.

11
Q

What have findings shown about the role of context in incidental word learning? (Joseph & Nation, 2018)

A

Context had no effect.

12
Q

How does print exposure affect word recognition?

A

Print exposure improves word recognition.
(eg. reading in school improves children's oral language, and print exposure also predicts students' success at university)

13
Q

Can frequency effects be extended to multi-word phrases? (Arnon & Snider, 2010)

A

Participants were asked: 'Is this a possible phrase in English?'
Yes: frequency effects were found for multi-word phrases (higher-frequency phrases were judged faster).

14
Q

What are binomial expressions?

A

Three-word phrases formed by 2 content words of the same lexical class and a conjunction.
(eg. bride and groom, king and queen)

15
Q

What findings came out of the study on binomial recognition? What does this suggest?

A

Binomials (eg. 'bride and groom') are recognised faster than their reversed forms (eg. 'groom and bride'). This suggests that frequency effects extend to multi-word sequences.

16
Q

What is lexical similarity?

A

The similarity between words in the way they are written.
(eg. orthographic neighbours)

17
Q

What is an orthographic neighbour?

A

A word that can be formed by changing a single letter of a target word, keeping the other letters and their positions the same (eg. 'cot' is a neighbour of 'cat'). The number of such words is the target's orthographic neighbourhood size.
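
A small sketch of how neighbourhood size could be counted, assuming a toy word list; a real count would use a full lexicon.

    # Sketch: orthographic neighbours are words of the same length that
    # differ from the target at exactly one letter position.
    LEXICON = {"cat", "cot", "cab", "can", "bat", "hat", "rat", "car", "cut"}

    def is_neighbour(w1, w2):
        if len(w1) != len(w2) or w1 == w2:
            return False
        return sum(a != b for a, b in zip(w1, w2)) == 1

    def neighbourhood_size(word, lexicon=LEXICON):
        return sum(is_neighbour(word, other) for other in lexicon)

    print(neighbourhood_size("cat"))  # 8 in this toy lexicon
    print(neighbourhood_size("dog"))  # 0 in this toy lexicon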

18
Q

Having many orthographic neighbours _______ word recognition. However, this depends on _________.

A

Facilitates (speeds up); word frequency.

19
Q

What would the IA model predict about word recognition of words with many orthographic neighbours?

A

It would predict that recognition of such words would be slower, because there would be more competition between word nodes.

20
Q

What is the Multiple Read-out Model? (Grainger & Jacobs, 1996)

A

An IA model extended with decision criteria for responding:
1. Single word node activity (unique word identification)
2. Summed activity of all active word nodes (fast 'yes' response)
3. Time threshold (deadline for a 'no' response)

21
Q

What was proposed as a new measure of orthographic similarity?

A

Levenshtein distance.

22
Q

What is Levenshtein distance?

A

The minimum number of edit operations (substitution/deletion/insertion) between two words to turn one word into the other.
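
A minimal dynamic-programming sketch of Levenshtein distance in Python, for illustration only.

    # Sketch: Levenshtein (edit) distance between two strings.
    def levenshtein(a, b):
        prev = list(range(len(b) + 1))               # distances from "" to prefixes of b
        for i, ca in enumerate(a, start=1):
            curr = [i]
            for j, cb in enumerate(b, start=1):
                curr.append(min(prev[j] + 1,                 # deletion
                                curr[j - 1] + 1,             # insertion
                                prev[j - 1] + (ca != cb)))   # substitution or match
            prev = curr
        return prev[-1]

    print(levenshtein("chance", "change"))  # 1 (one substitution)
    print(levenshtein("cat", "cast"))       # 1 (one insertion)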

23
Q

What is OLD20?

A

Orthographic Levenshtein Distance 20: the mean Levenshtein distance between a word and its 20 closest words in the lexicon (Yarkoni, Balota & Yap, 2008).
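
A sketch of how an OLD-style measure could be computed; it reuses the levenshtein() function from the previous sketch, and the tiny word list and n=3 are assumptions for illustration (real OLD20 uses the 20 closest words in a full lexicon).

    # Sketch: mean Levenshtein distance to the n closest words in the lexicon.
    def old_n(word, lexicon, n=20):
        distances = sorted(levenshtein(word, other) for other in lexicon if other != word)
        closest = distances[:n]
        return sum(closest) / len(closest)

    toy_lexicon = ["cat", "cot", "cab", "can", "chat", "coast", "crypt"]
    print(old_n("cat", toy_lexicon, n=3))    # many close neighbours -> small value
    print(old_n("crypt", toy_lexicon, n=3))  # orthographically isolated -> larger value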

24
Q

What were the findings of OLD20 in comparison to orthographic neighbourhood size?

A

OLD20 was better than neighbourhood size at predicting naming and lexical decision latencies, and its effect was facilitatory.

25
Q

What are the 2 kinds of word storage?

A
  1. The mental lexicon
  2. Semantic memory (word meaning) - conceptual store
26
Q

What should semantic theory of words explain?

A
  • How words relate to the world
  • Lexical ambiguity (eg. bank)
  • Set inclusion (eg. dogs are animals)
  • Antonymy (eg. long opposite of short)
27
Q

A theory of word meaning should be compatible with _________ on word meaning.

A

Psychological data.

28
Q

What is a concept?

A

A mental representation of a particular category; it determines how things are related and categorised.

29
Q

All words have an underlying _______ but not all _______ are labelled by a word.

A

Concept, concepts.

30
Q

What is a denotation of a word?

A

The core, essential meaning of a word.

31
Q

What is a connotation of a word?

A

The secondary implications/emotional association of a word.

32
Q

What is decompositional theory?

A

Word meanings are best described in terms of sets of bivalent semantic features.

Word meaning can be decomposed into a finite set of primitives which are universal across languages.

33
Q

What are defining features?

A

Features an object must have.

34
Q

What are characteristic features?

A

Features an object commonly has.

35
Q

What are intercorrelated features?

A

Features that tend to occur together.

36
Q

What are distinguishing features?

A

Features that enable us to distinguish among things.
These hold a privileged status in semantic memory.

37
Q

What is the Semantic Network Model? (Collins & Quillian, 1969)

A

Concepts are represented by nodes in a network.

Nodes are joined together by links representing relations between concepts.

The meaning of a word is determined by the place of the node in the network as a whole.

Set inclusion links mean that nets are hierarchically organised.

38
Q

What is Prototype Theory? (Rosch, 1973; 1978)

A

Concepts are centred around a representation of a prototypical member of the class.

Boundaries are more ‘fuzzy’.

39
Q

What 2 things does Prototype Theory explain?

A
  1. Why sometimes the correct classification of an object is in doubt even when its features aren’t
  2. Why some examples of a concept come to mind more easily than others
40
Q

What is a sentence verification task?

A

Subjects are asked to respond true or false to sentences like:
A robin is a bird (set inclusion)
A robin has feathers (property attribution)

41
Q

What is the principle of cognitive economy?

A

Info about concepts is stored at the highest appropriate level in the hierarchy.
(eg. ‘has feathers’ stored with ‘bird’ rather than ‘robin’)
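
A toy sketch of cognitive economy in a Collins & Quillian-style hierarchy: each property is stored once, at the highest appropriate node, and retrieved for lower nodes by walking up the 'is-a' links; the nodes and properties here are illustrative assumptions.

    # Sketch: property lookup by traversing "is-a" links; the number of links
    # traversed is what the model predicts verification time should track.
    ISA = {"robin": "bird", "canary": "bird", "bird": "animal", "fish": "animal"}
    PROPERTIES = {
        "animal": {"breathes", "eats"},
        "bird": {"has feathers", "can fly"},
        "robin": {"has a red breast"},
    }

    def has_property(concept, prop):
        steps, node = 0, concept
        while node is not None:
            if prop in PROPERTIES.get(node, set()):
                return True, steps
            node = ISA.get(node)
            steps += 1
        return False, steps

    print(has_property("robin", "has a red breast"))  # (True, 0) - stored locally
    print(has_property("robin", "has feathers"))      # (True, 1) - inherited from bird
    print(has_property("robin", "breathes"))          # (True, 2) - inherited from animal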

42
Q

What are 4 assumptions of the hierarchical model?

A
  1. Takes time to move through each step
  2. One step is dependent upon completion of another step
  3. Retrieval proceeds from one node in all directions at once (parallel)
  4. Avg time for a step is independent of what levels are involved
43
Q

What did findings show about set size and word recognition?

A

The larger the set, the longer it takes to search: eg. the animal set is larger than the bird set, so verification statements involving 'animal' take longer.

44
Q

What was found about association strengths and property verification times?

A

Association strength between words can explain the differences in property verification times.

45
Q

Why is it a problem that ‘dog and animal’ are more closely linked than ‘dog and mammal’?

A

It goes against the hierarchy ('mammal' should sit between 'dog' and 'animal'), so hierarchical distance does not consistently predict verification times.

46
Q

What is the Spreading Activation Model? (Collins & Loftus, 1975)

A
  • The network is not hierarchical
  • It is organised by semantic similarity
  • When a node is activated, activation spreads along strong associative links to other nodes
  • Activation is automatic
47
Q

What can Spreading Activation Model explain?

A

Semantic priming (eg. 'nurse' is recognised faster after 'doctor' than after 'table').
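
A toy sketch of spreading activation: activating 'doctor' passes activation along weighted associative links, so 'nurse' ends up partially activated (and can then be recognised faster), whereas activating 'table' leaves 'nurse' untouched. The links and weights are illustrative assumptions.

    # Sketch: activation spreads outward from a source node, decaying as it goes.
    LINKS = {
        "doctor": {"nurse": 0.8, "hospital": 0.7, "patient": 0.6},
        "table": {"chair": 0.8, "wood": 0.5},
        "nurse": {"hospital": 0.6, "patient": 0.7},
    }

    def spread(source, decay=0.5, depth=2):
        activation = {source: 1.0}
        frontier = {source: 1.0}
        for _ in range(depth):
            new_frontier = {}
            for node, act in frontier.items():
                for neighbour, weight in LINKS.get(node, {}).items():
                    gain = act * weight * decay
                    activation[neighbour] = activation.get(neighbour, 0.0) + gain
                    new_frontier[neighbour] = new_frontier.get(neighbour, 0.0) + gain
            frontier = new_frontier
        return activation

    print(spread("doctor").get("nurse", 0.0))  # pre-activated -> primed
    print(spread("table").get("nurse", 0.0))   # 0.0 -> no priming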

48
Q

What are more modern models of word meaning based around the idea of? What is this?

A

Distributional semantics.
Words with similar meanings are used in similar contexts.

49
Q

What are 2 count models?

A
  1. LSA (latent semantic analysis)
  2. HAL (hyperspace analogue to language)
50
Q

What is an example of a predictor model?

A

CBOW (continuous bag of words)

51
Q

What is distributional semantics?

A

Large corpora are used to derive vector representations of words; the relatedness of a pair of words is obtained from the similarity of their vectors.

52
Q

What happens in LSA?

A

Count how often each word occurs in each paragraph.
(words x paragraphs = N x P matrix, which is then reduced in dimensionality with singular value decomposition)
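
A minimal numpy sketch of the counting step: build the words x paragraphs matrix and compare two word vectors with cosine similarity. The toy paragraphs are assumptions, and the weighting and dimensionality-reduction (SVD) steps of full LSA are omitted.

    # Sketch: the counting step of LSA (no weighting or SVD).
    import numpy as np

    paragraphs = [
        "the doctor treated the patient in the hospital",
        "the nurse helped the doctor at the hospital",
        "the chef cooked dinner in the kitchen",
    ]
    vocab = sorted({w for p in paragraphs for w in p.split()})
    index = {w: i for i, w in enumerate(vocab)}

    counts = np.zeros((len(vocab), len(paragraphs)))      # N x P matrix
    for j, p in enumerate(paragraphs):
        for w in p.split():
            counts[index[w], j] += 1

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(counts[index["doctor"]], counts[index["nurse"]]))  # relatively high
    print(cosine(counts[index["doctor"]], counts[index["chef"]]))   # 0.0 here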

53
Q

What happens in HAL?

A

Count the co-occurrences of word pairs within a small sliding window.
(words x words = N x N matrix)
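
A simplified sketch of the HAL idea: count how often word pairs occur within a small sliding window. Real HAL uses a large corpus, a window of around ten words, and distance-weighted, direction-sensitive counts; this toy version is unweighted and symmetric.

    # Sketch: sliding-window co-occurrence counts (words x words).
    from collections import defaultdict

    text = "the doctor and nurse worked at the hospital and the chef cooked dinner".split()
    WINDOW = 2  # count pairs at most two words apart

    cooc = defaultdict(int)
    for i, w in enumerate(text):
        for j in range(i + 1, min(i + 1 + WINDOW, len(text))):
            pair = tuple(sorted((w, text[j])))
            cooc[pair] += 1

    print(cooc[("doctor", "nurse")])   # 1 - they occur within the window
    print(cooc[("chef", "doctor")])    # 0 - never close enough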

54
Q

What happens in CBOW?

A

A word2vec architecture.
- Tries to predict a target word from its surrounding context words
- Does this by adjusting the connection strengths (weights) in a neural network

eg. man is to king as woman is to ______
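
A toy numpy sketch of a CBOW-style training loop: average the context word vectors, predict the target word with a softmax, and nudge the weights by the prediction error. The corpus, vector size, learning rate and plain softmax (rather than word2vec's negative sampling) are simplifying assumptions.

    # Sketch: one tiny CBOW network trained by gradient descent.
    import numpy as np

    corpus = "the king rules the land the queen rules the land".split()
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    V, D = len(vocab), 10                     # vocabulary size, vector dimension

    rng = np.random.default_rng(0)
    W_in = rng.normal(0, 0.1, (V, D))         # input (context) word vectors
    W_out = rng.normal(0, 0.1, (D, V))        # output (prediction) weights
    lr, window = 0.1, 2

    for _ in range(200):
        for t, target in enumerate(corpus):
            context = [corpus[j]
                       for j in range(max(0, t - window), min(len(corpus), t + window + 1))
                       if j != t]
            h = W_in[[idx[w] for w in context]].mean(axis=0)   # average context vectors
            scores = h @ W_out
            probs = np.exp(scores - scores.max())
            probs /= probs.sum()                               # softmax over the vocabulary
            err = probs.copy()
            err[idx[target]] -= 1.0                            # gradient of cross-entropy loss
            grad_h = W_out @ err                               # gradient w.r.t. the hidden layer
            W_out -= lr * np.outer(h, err)
            for w in context:                                  # share the gradient over context words
                W_in[idx[w]] -= lr * grad_h / len(context)

    # Words used in similar contexts (eg. "king" and "queen" here) are pushed
    # towards similar vectors, which is what supports analogy-style relations.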

55
Q

Which is better at predicting reaction times? Count or predictor models?

A

Predictor models.

56
Q

What is a general problem of semantic theories?

A

What is the relation between abstract symbols and our knowledge, experiences, actions & perceptions?