Lectures 6 and 7 semantic parsing Flashcards
Semantic parsing
How do the meanings of words combine to make the meaning of a sentence?
* Treating semantic parsing as a plain ML problem is not good at compositionality, which is what semantics is all about
AM dependency trees
* Turns the problem into parsing + supertagging
* Supertagging: use a neural network to guess the best graph fragment for each word
To get the correct AMR (see the sketch after this list):
* Need functor-argument structure
* Don’t need word order
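As a hedged illustration (my own toy example; the sentence and constant names are not from the lecture), an AM dependency tree can be written down as plain data: a graph constant per word plus labeled functor-argument edges, with no role for linear order:

```python
# Toy AM dependency tree for "John sleeps" (illustrative names).
# Supertagging assigns each word a graph constant; edges carry
# AM operations and encode functor-argument structure only --
# the linear order of the words is not represented.
am_tree = {
    "sleeps": {"constant": "sleep[S]",        # fragment with an open S slot
               "edges": [("APP_S", "John")]}, # fill S with John's graph
    "John":   {"constant": "john[]",          # fragment with no open slots
               "edges": []},
}
```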
BiLSTM
* Two recurrent neural networks, one reading the sentence in each direction (see the sketch below)
LSTM
* “Long Short-Term Memory” recurrent neural network: a kind of RNN that is good at remembering information over long spans
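A minimal PyTorch sketch of such a BiLSTM encoder (the framework, dimensions, and class names are my assumptions, not from the lecture):

```python
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Encodes each word with context from both directions."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # bidirectional=True runs one LSTM left-to-right and one
        # right-to-left and concatenates their hidden states per word.
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            bidirectional=True, batch_first=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        # returns:   (batch, seq_len, 2 * hidden_dim)
        encodings, _ = self.lstm(self.embed(token_ids))
        return encodings
```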
Dependency parsing with a BiLSTM
Use word encodings from the supertagging BiLSTM to train another BiLSTM to predict AM-algebra operations between words
What we get from the NN (see the sketch after this list):
* Supertagger: for each word in the sentence,
a probability distribution over graph constants (plus none)
* Edge model: for each pair of words in the sentence,
a probability distribution over AM operations (plus none)
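Continuing the sketch above, the two heads could look like this; the concatenation-based edge scorer and all shapes are assumptions (real dependency parsers often use biaffine scorers instead):

```python
import torch
import torch.nn as nn

class Supertagger(nn.Module):
    """Per word: a probability distribution over graph constants (+ none)."""
    def __init__(self, enc_dim, num_constants):
        super().__init__()
        self.score = nn.Linear(enc_dim, num_constants + 1)  # +1 for "none"

    def forward(self, encodings):             # (batch, n, enc_dim)
        return self.score(encodings).softmax(-1)

class EdgeModel(nn.Module):
    """Per word pair (i, j): a distribution over AM operations (+ none)."""
    def __init__(self, enc_dim, num_operations):
        super().__init__()
        self.score = nn.Linear(2 * enc_dim, num_operations + 1)

    def forward(self, encodings):
        b, n, d = encodings.shape
        heads = encodings.unsqueeze(2).expand(b, n, n, d)
        deps = encodings.unsqueeze(1).expand(b, n, n, d)
        pairs = torch.cat([heads, deps], dim=-1)  # (batch, n, n, 2d)
        return self.score(pairs).softmax(-1)
```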
Type system
graphs have types (see the code sketch after the examples below), which consist of:
● open slots
● the type that each open slot wants the graph that fills it to have
- Intransitive verb: [S]
- No open slots: []
- Transitive verb: [S, O]
- Control verb: [S, O[S]]
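A minimal Python sketch of this type system, with types as nested dictionaries mapping each open slot to the type it requests; the exact-match rule for application is a simplification of the real AM type rules, and all names are my assumptions:

```python
# Graph types as nested dicts: slot name -> type requested for that slot.
EMPTY        = {}                                # []      no open slots
INTRANSITIVE = {"S": EMPTY}                      # [S]
TRANSITIVE   = {"S": EMPTY, "O": EMPTY}          # [S, O]
CONTROL      = {"S": EMPTY, "O": {"S": EMPTY}}   # [S, O[S]]

def apply_app(fn_type, slot, arg_type):
    """APP_slot: fill `slot` with a graph of type arg_type.
    Well-typed only if arg_type is exactly what the slot requests
    (a simplification of the real AM type rules)."""
    if fn_type.get(slot) != arg_type:
        raise TypeError(f"slot {slot} wants {fn_type.get(slot)}, got {arg_type}")
    return {s: t for s, t in fn_type.items() if s != slot}

# Filling the O slot of a control verb with an intransitive verb
# (type [S]) leaves a graph of type [S]:
assert apply_app(CONTROL, "O", INTRANSITIVE) == INTRANSITIVE
```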
AM Dependency parsing
Finding the best well-typed AM dependency tree and evaluating it to a graph is slow:
the type system makes parsing NP-complete, and even two approximate unpruned
chart parsers remain slow
Transition-based parsing
Transition-based parsing is faster than chart parsing: draw all outgoing edges of a node,
determine its graph constant, then recursively continue with all children (sketched in code below)
* top down parsing
* no projectivity constraints
* Run-time is O(n^2)
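A greedy sketch of this top-down scheme; the `edge_op` and `constant_for` callables stand in for the trained edge model, supertagger, and type checks, and are my assumptions:

```python
def parse(n_words, root, edge_op, constant_for):
    """Top-down transition parsing: for the current node, first draw all
    outgoing edges, then determine its graph constant, then recurse."""
    tree = {}                                  # head -> (constant, children)
    unattached = set(range(n_words)) - {root}

    def expand(head):
        # 1. Draw all outgoing edges: for every still-unattached word,
        #    choose an AM operation or None. O(n) work per node, and each
        #    node is expanded once, giving O(n^2) overall.
        children = [(w, edge_op(head, w)) for w in sorted(unattached)]
        children = [(w, op) for w, op in children if op is not None]
        for child, _ in children:
            unattached.discard(child)          # no projectivity constraint
        # 2. Determine the graph constant of the head.
        tree[head] = (constant_for(head, children), children)
        # 3. Recursively continue with all children.
        for child, _ in children:
            expand(child)

    expand(root)
    return tree
```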
Lexical types and term types
- Lexical type: the type of the graph constant itself.
- Term type: the type of the graph that the whole subtree evaluates to.
For example, a control verb's constant has lexical type [S, O[S]], but once its O slot is filled by an intransitive verb, the subtree's term type is [S].
Theoretical guarantees
- Soundness: Every derived AM dependency tree is well-typed.
- Completeness: All well-typed AM dependency trees can be derived.
- No dead ends: Every parser configuration can be completed.