WordNet Flashcards

1
Q

Polysemous

A

having multiple meanings

2
Q

Word sense

A

A discrete representation of one aspect of the meaning of a word.

3
Q

thesaurus

A

A database that represents word senses.

4
Q

Antonymy

A

When two words/lemmas have opposite meanings.

5
Q

WSD (abbreviation)

A

word sense disambiguation

6
Q

word sense disambiguation

A

The task of determining which sense of a word is being used in a particular context.

7
Q

How does WordNet represent different senses of the same word?

A

With superscripts: mouse¹, mouse², etc.

8
Q

embedding

A

a point in semantic space

9
Q

glosses

A

textual definitions for each sense, in dictionaries or thesauruses.

10
Q

Why can a gloss still be useful, even though it is often circular and relies on real-world knowledge that we as humans have?

A

It is just a sentence, and we can compute a sentence embedding from it that captures something of the meaning of the sense.
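
A minimal sketch of this idea (not part of the deck): embed two hypothetical glosses and a context sentence, assuming the sentence-transformers library and the all-MiniLM-L6-v2 model; any sentence encoder would do.

```python
from sentence_transformers import SentenceTransformer, util

# Assumption: sentence-transformers with a small pretrained model.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Two hypothetical glosses for 'mouse' and a context sentence.
glosses = [
    "a small rodent with a pointed snout and a long tail",
    "a hand-held pointing device for computers",
]
context = "She clicked the mouse to open the file."

g_emb = model.encode(glosses)
c_emb = model.encode(context)
print(util.cos_sim(c_emb, g_emb))  # the device gloss should score higher
```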

11
Q

Zeugma

A

A conjunction of different readings of a word, for example ‘Air France serves breakfast and the USA’.

12
Q

Synonymy

A

When two senses of two different words/lemmas are identical.

13
Q

Reversives

A

A group of antonyms that describe change or movement in opposite directions.

14
Q

Hyponymy

A

When a word/sense is more specific, denoting a subclass of another word/sense.

15
Q

Hypernymy

A

When a word/sense describes a less specific or superclass of another word/sense.

16
Q

Superordinate ==

A

Hypernym

17
Q

Other word for hypernym

A

Superordinate

18
Q

What does ‘A IS-A B’ mean?

A

A is a subclass or instance of B; equivalently, B subsumes A.

19
Q

Meronymy

A

A part-whole relation.

20
Q

Holonymy

A

A whole-part relation. Example: car is a holonym of wheel.

21
Q

Structured polysemy

A

When the senses of a word are related semantically.

22
Q

Metonymy

A

A type of polysemy relation in which one aspect of a concept or entity is used to refer to other aspects of the entity, or to the entity itself.

23
Q

Synset

A

The set of near-synonyms for a WordNet sense.

24
Q

Supersenses

A

The lexicographic categories in which senses are grouped in WordNet.

25
Q

Lexical sample tasks

A

Situations where we just need to disambiguate a small number of words.

26
Q

All-words task

A

A problem in which we have to disambiguate all the words in some text.

27
Q

Semantic concordance

A

A corpus in which each open-class word in each sentence is labeled with its word sense from a specific dictionary or thesaurus.

28
Q

Most frequent sense baseline

A

Choose the most frequent sense for each word from the senses in a labeled corpus. Can be quite accurate.
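
A minimal sketch of this baseline (not from the deck), using a tiny made-up sense-labeled corpus; the sense labels are hypothetical:

```python
from collections import Counter

# Hypothetical (word, sense) pairs, e.g. as extracted from a labeled corpus.
labeled = [("bank", "bank%1"), ("bank", "bank%1"), ("bank", "bank%2"),
           ("mouse", "mouse%2"), ("mouse", "mouse%2")]

counts = {}
for word, sense in labeled:
    counts.setdefault(word, Counter())[sense] += 1

# Most frequent sense baseline: always predict the commonest sense per word.
mfs = {word: c.most_common(1)[0][0] for word, c in counts.items()}
print(mfs)  # {'bank': 'bank%1', 'mouse': 'mouse%2'}
```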

29
Q

One sense per discourse baseline

A

The heuristic that a word tends to be used in the same sense throughout a discourse. Not generally used as a baseline, though it holds better for coarse-grained senses.

30
Q

How does the 1-nearest-neighbor WSD algorithm produce contextual sense embeddings (v_s) for sense s?

A

It averages the n contextual representations v_i of the n tokens labeled with sense s: v_s = (1/n) Σ_i v_i.

31
Q

How does the 1-nearest-neighbor WSD algorithm produce its output?

A

It chooses the sense whose embedding has the highest cosine similarity with the contextual embedding of the target word.
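
A toy sketch combining the two steps (cards 30 and 31), with made-up 2-D vectors standing in for encoder outputs such as BERT's:

```python
import numpy as np

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical contextual embeddings of labeled training tokens.
train = {
    "mouse%animal": [np.array([0.9, 0.1]), np.array([0.8, 0.2])],
    "mouse%device": [np.array([0.1, 0.9]), np.array([0.2, 0.8])],
}

# Sense embedding v_s: average of the contextual vectors v_i of the
# training tokens labeled with sense s.
sense_emb = {s: np.mean(vecs, axis=0) for s, vecs in train.items()}

def disambiguate(v_target):
    """1-nearest-neighbor WSD: pick the sense whose embedding is most
    cosine-similar to the target token's contextual embedding."""
    return max(sense_emb, key=lambda s: cos(sense_emb[s], v_target))

print(disambiguate(np.array([0.85, 0.15])))  # -> 'mouse%animal'
```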

32
Q

What class of WSD algorithms does not require labeled data?

A

knowledge-based algorithms

33
Q

Lesk algorithm

A

A WSD algorithm that chooses the sense whose dictionary/gloss/definition shares the most words with the target word’s neighborhood.
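
A sketch of the simplified Lesk idea with two hypothetical glosses for 'bank'; real implementations would lemmatize and use a proper stop list:

```python
# Pick the sense whose gloss overlaps most (in content words) with
# the target word's surrounding context.
glosses = {
    "bank%1": "a financial institution that accepts deposits",
    "bank%2": "sloping land beside a body of water",
}
stop = {"a", "the", "of", "that", "in", "on"}

def lesk(context, glosses):
    ctx = {w for w in context.lower().split() if w not in stop}
    def overlap(sense):
        return len(ctx & {w for w in glosses[sense].split() if w not in stop})
    return max(glosses, key=overlap)

print(lesk("he sat on the bank of the river watching the water", glosses))
# -> 'bank%2' (its gloss shares 'water' with the context)
```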

34
Q

WiC (abbreviation)

A

word-in-context

35
Q

word-in-context task

A

Given two sentences with a target word in different contexts, the system must decide whether the words are used in the same sense or in different senses.

36
Q

Retrofitting / counterfitting

A

Methods that learn a second mapping after embeddings have been trained, shifting the embeddings such that synonyms are closer to each other and antonyms are further apart.
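
A crude illustrative sketch, not the published retrofitting/counter-fitting objectives: one update step that pulls synonym vectors together and pushes antonym vectors apart, on made-up toy vectors:

```python
import numpy as np

# Hypothetical toy embeddings (2-D for illustration only).
emb = {
    "happy": np.array([1.0, 0.2]),
    "glad":  np.array([0.6, 0.9]),
    "sad":   np.array([0.9, 0.4]),
}
synonyms = [("happy", "glad")]
antonyms = [("happy", "sad")]

def refit_step(emb, synonyms, antonyms, lr=0.1):
    """One gradient-style update: move synonym pairs closer together
    and antonym pairs farther apart (a stand-in for the fuller
    objectives the published methods optimize)."""
    for a, b in synonyms:
        delta = emb[b] - emb[a]
        emb[a] += lr * delta   # pull a toward b
        emb[b] -= lr * delta   # and b toward a
    for a, b in antonyms:
        delta = emb[b] - emb[a]
        emb[a] -= lr * delta   # push a away from b
        emb[b] += lr * delta
    return emb

refit_step(emb, synonyms, antonyms)
print(emb["happy"], emb["glad"])  # 'happy' and 'glad' have moved closer
```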

37
Q

WSI (abbreviation)

A

word sense induction

38
Q

Word sense induction

A

An unsupervised approach to WSD in which senses are induced automatically (e.g., by clustering contexts) rather than taken from a predefined sense inventory.

39
Q

Agglomerative clustering

A

Type of clustering where each of the N training instances is initially assigned to its own cluster. New clusters are formed bottom-up by merging similar clusters.
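
A from-scratch sketch (not from the deck) of single-link agglomerative clustering on toy points: every point starts in its own cluster, and the two closest clusters are merged bottom-up until the requested number remains.

```python
import numpy as np

def agglomerative(points, n_clusters):
    """Single-link agglomerative clustering: repeatedly merge the two
    clusters whose closest members are nearest to each other."""
    clusters = [[i] for i in range(len(points))]  # one cluster per instance
    while len(clusters) > n_clusters:
        best = (None, None, float("inf"))
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single-link: distance between the closest members
                d = min(np.linalg.norm(points[i] - points[j])
                        for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, _ = best
        clusters[a] += clusters.pop(b)  # merge the closest pair
    return clusters

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(agglomerative(pts, 2))  # -> [[0, 1], [2, 3]]
```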