Lecture 9: Intro to NLP Flashcards

1
Q

What is NLP?

A

Natural Language Processing –> automatic processing of written text
1. Natural Language Understanding –> input = text
2. Natural Language Generation –> output = text

2
Q

What is Speech processing?

A

Automatic processing of speech
1. Speech recognition –> input = acoustic signal
2. Speech synthesis –> output = acoustic signal

3
Q

What is rule-based NLP (1950 to mid-1980s)?

A

-Cognitive approach
-Rules are developed by hand in collaboration with linguists

4
Q

What is statistical NLP (mid-1980s to 2010)?

A

-Engineering approach
-Rules are developed automatically with ML
-Linguistic features are hand-engineered and fed to the ML model

5
Q

What is deep language processing (2010-now)?

A

-Rules are developed automatically with ML
-Linguistic features are found automatically

6
Q

What are the 2 limits of the Bag-of-Words (BoW) model?

A

-Word order is ignored –> much of the text's meaning is lost
-n-grams take only a bit of the word order into account (see the sketch below)
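
A minimal Python sketch (the sentences are made-up examples, not from the lecture) of why ignoring word order loses meaning: two sentences that say opposite things get identical BoW counts.

from collections import Counter

# Two sentences with opposite meanings
s1 = "the dog bit the man"
s2 = "the man bit the dog"

# Bag-of-words: count each word, ignoring order
bow1 = Counter(s1.split())
bow2 = Counter(s2.split())

print(bow1 == bow2)  # True -> the two texts are indistinguishable under BoW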

7
Q

What is the n-gram model (2)?

A

-Probability distribution over sequences of events
-Models the order of events (see the formula sketch below)
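
As a sketch of the standard formula behind this (notation is mine, not spelled out on the card), an n-gram model approximates the probability of a sequence by conditioning each event only on the previous n-1 events:

P(e_1, \dots, e_T) \approx \prod_{t=1}^{T} P(e_t \mid e_{t-n+1}, \dots, e_{t-1})

For a bigram model (n = 2) this reduces to P(e_t \mid e_{t-1}) at each step.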

8
Q

When is the n-gram model used?

A

To predict the next event in a sequence of events

9
Q

What’s a language model?

A

It's an n-gram model over word/character sequences (see the toy example below)
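
A minimal bigram language model sketch (the toy corpus and function name are hypothetical, for illustration only): count how often each word follows another, then predict the most frequent follower.

from collections import Counter, defaultdict

corpus = "i like nlp i like nlp i study speech".split()

# counts[w1][w2] = how many times w2 follows w1 in the corpus
counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    counts[w1][w2] += 1

def predict_next(word):
    # Most frequent follower = highest-probability next word under this bigram model
    return counts[word].most_common(1)[0][0]

print(predict_next("i"))     # 'like' (follows "i" twice, "study" only once)
print(predict_next("like"))  # 'nlp'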

10
Q

What are examples of applications of language models (4)?

A

-Speech recognition
-Statistical machine translation
-Language identification
-Spelling correction

11
Q

What are 2 issues with n-grams?

A

-Natural language is not linear
-There may be long-distance dependencies: syntactic, semantic, or world-knowledge (e.g. subject-verb agreement across a relative clause: "The dog that chased the cats runs fast")

12
Q

What are the 3 stages of NLU?

A

1.Parsing (syntax)
2.Semantic interpretation
3.Contextual/world knowledge interpretation

13
Q

What is parsing?

A

-To assign syntactic structures to a sentence
-To determine how words are put together to form correct sentences

14
Q

What is the goal of semantic interpretation?

A

-Map a sentence to some representation of its meaning

15
Q

What are the 2 types of semantics?

A

1. Lexical semantics –> meaning of individual words
2. Compositional semantics –> meaning of combinations of words

16
Q

What is word sense disambiguation (WSD)?

A

Determining which sense of a word is used in a specific sentence

17
Q

What is the goal of Naive Bayes WSD?

A

Choose the most probable sense s* for a word given a vector V of surrounding words
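
As a sketch of the standard Naive Bayes formulation (symbols follow the card; the independence assumption is the usual one, not stated explicitly above):

s^* = \arg\max_{s} P(s \mid V) = \arg\max_{s} P(s) \prod_{v_i \in V} P(v_i \mid s)

i.e. the context words v_i in V are assumed conditionally independent given the sense s, and P(V) is dropped because it does not depend on s.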

18
Q

What is world knowledge?

A

Using our general knowledge of the world to interpret a sentence or discourse