Parsing and Generation Flashcards

1
Q

Generative Grammar

A

A formally specified grammar that can generate all and only the acceptable sentences of a natural language.

2
Q

Constituent

A

A phrase that functions as a unit; in bracketed notation, anything enclosed in a matching pair of brackets.

3
Q

Weakly-equivalent grammars

A

Generate the same strings.

4
Q

Strongly-equivalent grammars

A

Assign the same bracketings to all strings they generate.
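
To make the distinction concrete, here is a small sketch (the toy grammars are my own illustration, not from the notes): S → a S | a and S → S a | a are weakly equivalent, since both generate a, aa, aaa, ..., but not strongly equivalent, because they bracket those strings differently.

```python
# Illustrative sketch: the bracketing each toy grammar assigns to "a a a".

def right_branching(n):
    """S -> a S | a : the nonterminal daughter is always rightmost."""
    s = "a"
    for _ in range(n - 1):
        s = f"(a {s})"
    return s

def left_branching(n):
    """S -> S a | a : the nonterminal daughter is always leftmost."""
    s = "a"
    for _ in range(n - 1):
        s = f"({s} a)"
    return s

print(right_branching(3))   # (a (a a))  -- same terminal string,
print(left_branching(3))    # ((a a) a)  -- different constituent structure
```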

5
Q

Context-Free Grammars

A

Four components:

  1. A set of non-terminal symbols, conventionally written in uppercase.
  2. A set of terminal symbols, conventionally written in lowercase.
  3. A set of rules, where the left-hand side is a single non-terminal and the right-hand side is a sequence of one or more non-terminals or terminals.
  4. A start symbol, conventionally S, which is a member of the set of non-terminal symbols.
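
A minimal sketch of these four components as a Python data structure (the toy rules and lexicon are my own illustration, not from the notes):

```python
# One possible representation of a CFG's four components.
from dataclasses import dataclass

@dataclass
class CFG:
    nonterminals: set   # non-terminal symbols, conventionally uppercase
    terminals: set      # terminal symbols, conventionally lowercase
    rules: list         # (lhs, rhs) pairs: lhs a non-terminal, rhs a tuple of symbols
    start: str          # start symbol, conventionally "S"

toy = CFG(
    nonterminals={"S", "NP", "VP", "V"},
    terminals={"they", "can", "fish"},
    rules=[
        ("S",  ("NP", "VP")),
        ("VP", ("V", "NP")),
        ("VP", ("V",)),
        ("NP", ("they",)),
        ("NP", ("fish",)),
        ("V",  ("can",)),
        ("V",  ("fish",)),
    ],
    start="S",
)
```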
6
Q

Empty Productions

A

Productions with an empty right-hand side. It is convenient to exclude these, since they complicate parsing algorithms, and a weakly-equivalent grammar can always be constructed that has no empty productions.
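
A small illustration of the kind of transformation meant here (the toy rules are mine, and this shows only the idea, not a general algorithm): drop the empty production and add a copy of each rule that uses the emptied category with that daughter removed.

```python
# Toy grammar with an empty production for an optional determiner.
with_empty = [
    ("NP",  ("Det", "N")),
    ("Det", ("the",)),
    ("Det", ()),           # empty production: Det rewrites to nothing
    ("N",   ("fish",)),
]

# Weakly-equivalent grammar without empty productions:
# "the fish" and "fish" are still both generated.
without_empty = [
    ("NP",  ("Det", "N")),
    ("NP",  ("N",)),       # covers the case where Det rewrote to nothing
    ("Det", ("the",)),
    ("N",   ("fish",)),
]
```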

7
Q

Left-associative

A

A grammar in which every nonterminal daughter is the leftmost daughter of its rule.

8
Q

Right-associative

A

A grammar in which every nonterminal daughter is the rightmost daughter of its rule.

9
Q

Lexical Ambiguity

A

Ambiguity arising from words having multiple lexical entries, e.g. words that can be treated as different types of verb, or as either a verb or a noun. "They can fish" is a classic example.
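
A tiny illustration (my own toy lexicon, not from the notes): the ambiguity of "they can fish" comes from "can" and "fish" each having more than one lexical entry.

```python
# Toy lexicon: each word maps to its possible categories.
lexicon = {
    "they": ["PRON"],
    "can":  ["AUX", "V"],   # modal auxiliary vs. main verb ("to can fish")
    "fish": ["V", "N"],     # verb ("to fish") vs. noun
}
```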

10
Q

Structural Ambiguity

A

Ambiguity arising from different possible attachments of phrases.
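
For example (a standard illustration, not taken from the notes), "I saw the man with the telescope" allows the PP to attach either to the VP or to the NP:

```python
# Two analyses of the same string, written as bracketed structures.
vp_attachment = "(S (NP I) (VP (V saw) (NP the man) (PP with the telescope)))"
np_attachment = "(S (NP I) (VP (V saw) (NP (NP the man) (PP with the telescope))))"
```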

11
Q

Parse tree

A

The structure of a sentence in the form of a tree. Equivalent to the bracketed structure, but easier to read in complex cases.
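
A minimal sketch of that equivalence (my own representation, not from the notes): a tree as nested tuples, plus a helper that renders it in bracketed form.

```python
# Parse tree for "they can fish" as nested (label, child, ...) tuples.
tree = ("S", ("NP", "they"), ("VP", ("V", "can"), ("NP", "fish")))

def bracketed(node):
    """Render a (label, child, ...) tuple, or a bare word, as a bracketed string."""
    if isinstance(node, str):
        return node
    label, *children = node
    return "(" + label + " " + " ".join(bracketed(c) for c in children) + ")"

print(bracketed(tree))   # (S (NP they) (VP (V can) (NP fish)))
```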

12
Q

Chart Parsing

A

Keeping a record of the rules we have applied so that we do not have to backtrack and redo work we have already done. This works for parsing with CFGs because the rules apply independently of context. The data structure used to record partial results is known as a chart. Such strategies are designed to be complete. See pages 31-35 for more details.
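
A hedged sketch of the idea (a CKY-style parser of my own, not necessarily the algorithm in the notes; it assumes the grammar has been reduced to binary rules): the chart records, per span, which categories have already been found, so no span is ever re-analysed.

```python
from collections import defaultdict

def cky_chart(words, lexicon, binary_rules):
    """lexicon: word -> set of categories; binary_rules: (B, C) -> set of A for rules A -> B C."""
    n = len(words)
    chart = defaultdict(set)                      # (start, end) -> categories found over that span
    for i, w in enumerate(words):                 # lexical edges
        chart[(i, i + 1)] |= lexicon.get(w, set())
    for span in range(2, n + 1):                  # build longer spans from recorded shorter ones
        for start in range(0, n - span + 1):
            end = start + span
            for mid in range(start + 1, end):
                for b in chart[(start, mid)]:
                    for c in chart[(mid, end)]:
                        chart[(start, end)] |= binary_rules.get((b, c), set())
    return chart

lexicon = {"they": {"NP"}, "can": {"V"}, "fish": {"NP", "V"}}
binary_rules = {("V", "NP"): {"VP"}, ("NP", "VP"): {"S"}}
chart = cky_chart("they can fish".split(), lexicon, binary_rules)
print("S" in chart[(0, 3)])   # True: the whole string has been recognised as an S
```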

13
Q

Packing

A

Changing the daughters value on an edge to be a set of lists of daughters, and making an equality check before adding an edge so that we do not add one that is equivalent to an existing edge.
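
A minimal sketch of what this looks like in a chart (my own representation, not the notes' code): an edge is keyed by category and span, and its daughters value holds a set of alternative daughter sequences.

```python
edges = {}   # (category, start, end) -> set of alternative daughter tuples

def add_edge(cat, start, end, daughters):
    key = (cat, start, end)
    if key in edges:
        edges[key].add(daughters)      # pack: record the new analysis on the existing edge
    else:
        edges[key] = {daughters}       # first analysis found for this category and span

# Two VP analyses over the same span get packed into a single edge.
add_edge("VP", 1, 4, (("V", 1, 2), ("NP", 2, 4)))
add_edge("VP", 1, 4, (("VP", 1, 3), ("PP", 3, 4)))
print(edges[("VP", 1, 4)])   # one edge, two alternative daughter sequences
```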

14
Q

Why can’t FSAs be used to model natural language syntax?

A

Centre embedding, which occurs in natural language, cannot be captured by FSAs. However, natural languages may only exhibit finite levels of embedding in practice, which means FSAs could in principle suffice. Even so, grammars written using finite-state techniques alone are highly redundant, and without internal structure we cannot build up good semantic representations.

15
Q

Deficiencies in Atomic Category CFGs.

A

If simple don’t account for subject-verb agreement. We could allow for agreement by increasing number of atomic symbols. This approach doesn’t deal with subcategorisation, the lexical property telling us how many arguments a verb can have.
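
A small illustration of the blow-up alluded to (the toy rules are my own): encoding number agreement with extra atomic symbols forces every relevant category and rule to be duplicated, and subcategorisation distinctions would multiply them again.

```python
# Toy rules with agreement pushed into the atomic category names.
agreement_rules = [
    ("S",     ("NP_sg", "VP_sg")),
    ("S",     ("NP_pl", "VP_pl")),
    ("NP_sg", ("it",)),
    ("NP_pl", ("they",)),
    ("VP_sg", ("fishes",)),
    ("VP_pl", ("fish",)),
]
```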
