Week 10: Coherence Flashcards

1
Q

Sentences are locally coherent when…

A

…each sentence is connected to the sentences around it.

2
Q

What is relational coherence?

A

When every proposition that is introduced is rhetorically connected to another piece of information in the discourse, resulting in a single connected structure for the whole discourse.

All anaphoric expressions can be resolved.

3
Q

What is entity coherence?

A

Discourse is coherent to the extent that it is consistent in the entities that it focuses on.

Complementary to relational coherence.

4
Q

Lexical coherence

A

Views text as coherent when there are clear relations between the words used in nearby sentences.

The required relations can be discrete (formal relations such as hyponymy) or graded (e.g. distributional similarity).

5
Q

Rhetorical Structure Theory

A

A type of relational coherence.

An approach that analyses discourse into a hierarchical structure of relations between discourse segments.

6
Q

In Rhetorical Structure Theory, what are relations defined between?

A

An independently interpretable NUCLEUS and a dependent SATELLITE.
(Some relations have multiple nuclei)

7
Q

What is the first step in RST analysis?

A

Identifying the elementary discourse units.

8
Q

What are the elementary discourse units?

A

Non-overlapping subsequences of the text (typically clauses) between which relations hold.

9
Q

Segmented Discourse Representation Theory

A

A type of relational coherence.

Integrates rhetorical relations with compositional formal semantics.

A hearer’s goal is to find a coherent model of discourse: one in which all elements combine to form a single whole.

10
Q

What type of coherence does Centring Theory use?

A

Entity-based coherence.

11
Q

What is the backward-looking centre (C_b)?

A

The salient entity at U_n; it must have been mentioned in the immediately preceding utterance.

12
Q

What are the forward-looking centres (C_f)?

A

The potential future salient entities.
These are ranked for likelihood of being the salient entity in the next utterance.

13
Q

What is the preferred centre (C_p)?

A

The top-ranked entity.

14
Q

What is rule 1 of centring theory?

A

If any element of C_f(U_n) is realised as a pronoun in utterance U_(n+1), then C_b(U_(n+1)) must be realised as a pronoun too.

15
Q

What is rule 2 of centring theory?

A

Transition states are ordered: continue is preferred to retain, retain to smooth-shift, and smooth-shift to rough-shift.

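The transition ordering above can be sketched in code. This is a minimal illustration (not from the course notes) of the standard transition table: the state depends on whether C_b stays the same across utterances, and whether C_b(U_(n+1)) equals the preferred centre C_p(U_(n+1)). The function and variable names are my own.

```python
def transition(cb_next, cb_prev, cp_next):
    """Classify the centring transition from U_n to U_(n+1).

    cb_next: C_b(U_(n+1)), cb_prev: C_b(U_n), cp_next: C_p(U_(n+1)).
    """
    if cb_next == cb_prev or cb_prev is None:
        # Same backward-looking centre: continue or retain.
        return "continue" if cb_next == cp_next else "retain"
    # Backward-looking centre changed: smooth or rough shift.
    return "smooth-shift" if cb_next == cp_next else "rough-shift"

# Rule 2 preference order, most to least coherent:
ORDER = ["continue", "retain", "smooth-shift", "rough-shift"]
```

For example, if C_b carries over and is also the preferred centre, the transition is a continue; if C_b changes and is not the new preferred centre, it is a rough-shift.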
16
Q

What is the first step in the Entity Grid Model?

A

Generate a grid in which rows are sentences and columns are the entities.
The cells are marked "s" = subject, "o" = object, "x" = neither, or "-" = absent.
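A minimal sketch of this step, assuming the sentences have already been parsed into (entity, grammatical role) pairs; the function name and example sentences are illustrative, not from the model's original implementation.

```python
def build_grid(sentences):
    """Build an entity grid: one row per sentence, one column per entity."""
    # Collect all entities mentioned anywhere in the document.
    entities = sorted({e for sent in sentences for e, _ in sent})
    grid = []
    for sent in sentences:
        roles = dict(sent)  # entity -> "s" / "o" / "x" in this sentence
        grid.append([roles.get(e, "-") for e in entities])  # "-" = absent
    return entities, grid

# Hypothetical parsed input: (entity, role) pairs per sentence.
sents = [[("Microsoft", "s"), ("market", "o")],
         [("Microsoft", "o")],
         [("market", "s")]]
cols, grid = build_grid(sents)
# cols -> ['Microsoft', 'market']
# grid -> [['s', 'o'], ['o', '-'], ['-', 's']]
```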

17
Q

What is the second step in the Entity Grid Model?

A

Calculating the probability of transitions between role labels in successive sentences (read down each entity's column).
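The counting involved can be sketched as follows, assuming transitions of length two; the function name is mine and the grid is a toy example.

```python
from collections import Counter

def transition_probs(grid):
    """Relative frequency of each length-2 role transition in the grid."""
    counts = Counter()
    for col in zip(*grid):               # one column per entity
        for a, b in zip(col, col[1:]):   # adjacent sentence pairs
            counts[(a, b)] += 1
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

grid = [["s", "o"], ["o", "-"], ["-", "s"]]
probs = transition_probs(grid)
# e.g. probs[("o", "-")] == 0.5
```

The resulting distribution over transitions serves as a feature vector for comparing or ranking documents by coherence.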

18
Q

What is a lexical chain?

A

A series of related words that span a topic unit of text.
Relatedness is defined using a thesaurus.
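A toy sketch of greedy chain building. The thesaurus is stood in for by a hand-written relatedness table (real systems use a resource such as WordNet or Roget's); all names here are illustrative.

```python
# Hypothetical stand-in for a thesaurus: pairs of related words.
RELATED = {("apple", "fruit"), ("fruit", "pear"), ("apple", "pear")}

def related(a, b):
    return (a, b) in RELATED or (b, a) in RELATED

def lexical_chains(words):
    """Greedily attach each word to the first chain it is related to."""
    chains = []
    for w in words:
        for chain in chains:
            if any(related(w, c) for c in chain):
                chain.append(w)
                break
        else:
            chains.append([w])  # no related chain: start a new one
    return chains

# lexical_chains(["apple", "car", "fruit", "pear"])
#   -> [["apple", "fruit", "pear"], ["car"]]
```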

19
Q

When a lexical chain is found…

A

…a text is said to be cohesive.

20
Q

Limitation of lexical chains method?

A

The method depends on the quality and coverage of the thesaurus: relations it omits cannot be used, and spurious relations create false chains.

21
Q

Distributional semantics

A

Defines the similarities between words in terms of similarity of the contexts in which the words appear across a corpus.

One such approach is called Latent Semantic Analysis.
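The core idea can be sketched directly: represent each word by counts of the context words it co-occurs with, then compare words by the cosine of their context vectors. This is a minimal illustration with my own function names; LSA additionally applies a dimensionality reduction (SVD) to such a co-occurrence matrix.

```python
import math
from collections import Counter

def context_vector(word, corpus, window=2):
    """Count context words within `window` positions of `word`."""
    vec = Counter()
    for sent in corpus:
        for i, w in enumerate(sent):
            if w == word:
                context = sent[max(0, i - window):i] + sent[i + 1:i + 1 + window]
                for c in context:
                    vec[c] += 1
    return vec

def cosine(u, v):
    """Cosine similarity of two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
sim = cosine(context_vector("cat", corpus), context_vector("dog", corpus))
# "cat" and "dog" share identical contexts here, so sim == 1.0
```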