C5W2 Word Embeddings Flashcards

1
Q

Name two ways of representing words.

A

One-hot representation and embedding (dense vector) representation

2
Q

How can you visualise an N-dimensional embedding space?

A

Use the t-SNE algorithm, which non-linearly maps high-dimensional vectors down to 2D or 3D so they can be plotted

3
Q

What are the advantages of word embeddings?

A
  1. You can learn the embedding from unlabelled data (or take an open-source one), and then use transfer learning for your task
  2. Word embeddings can help to learn analogies (e.g. man : woman :: king : queen)
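The analogy property in item 2 can be sketched with vector arithmetic. The 2-dim vectors below are hand-picked toys chosen to illustrate the idea, not learned embeddings:

```python
# Sketch of the analogy property: with good embeddings,
# e_king - e_man + e_woman lands near e_queen.
# These 2-dim vectors are hand-picked toys, not trained embeddings.
man, woman = [0.9, 0.1], [0.1, 0.9]
king, queen = [0.95, 0.15], [0.15, 0.95]

# Vector arithmetic: king - man + woman
target = [k - m + w for k, m, w in zip(king, man, woman)]
print([round(x, 2) for x in target])  # → [0.15, 0.95], i.e. approximately queen
```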
4
Q

Difference between one-hot vectors and word embedding vectors?

A

One-hot vectors are sparse and high-dimensional (length equal to the vocabulary size, almost all zeros), while embedding vectors are dense and low-dimensional, so similar words can have similar vectors
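A minimal sketch of the contrast on a toy 4-word vocabulary; the 2-dim embedding values here are hypothetical (in practice they are learned):

```python
# Sketch contrasting sparse one-hot vectors with dense embeddings.
vocab = ["apple", "banana", "cat", "dog"]

def one_hot(word):
    """Sparse: length == vocabulary size, a single 1, all other entries 0."""
    v = [0] * len(vocab)
    v[vocab.index(word)] = 1
    return v

# Dense embeddings (hypothetical 2-dim values; real ones are learned):
embedding = {"apple": [0.9, 0.1], "banana": [0.8, 0.2],
             "cat": [0.1, 0.9], "dog": [0.2, 0.8]}

print(one_hot("cat"))    # → [0, 0, 1, 0]: mostly zeros, no notion of similarity
print(embedding["cat"])  # → [0.1, 0.9]: dense, and close to embedding["dog"]
```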

5
Q

What is an embedding matrix?

A

A matrix E whose height is the embedding dimension and whose width is the vocabulary size; multiplying E by a word's one-hot vector selects that word's column, i.e. its embedding
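The column-selection property can be sketched with toy numbers (a 2x3 matrix standing in for something like 300 x 10,000):

```python
# Sketch: E is (embedding_dim x vocab_size); E times a one-hot column
# vector selects that word's column. Toy 2x3 numbers for a 3-word vocabulary.
E = [
    [0.1, 0.4, 0.7],
    [0.2, 0.5, 0.8],
]
one_hot = [0, 1, 0]  # one-hot vector for the word at index 1

# Matrix-vector product E @ one_hot, written out by hand:
embedding = [sum(e * o for e, o in zip(row, one_hot)) for row in E]
print(embedding)  # → [0.4, 0.5], column 1 of E
```

In practice, frameworks use a direct column lookup instead of the full matrix product, since multiplying by a vector that is mostly zeros is wasteful.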

6
Q

How does a model learn embeddings?

A

You train a model to predict a target word from its context words; the weights it learns become the embeddings.
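A sketch of how the (context, target) training pairs for this prediction task can be generated, assuming a simple +/-2-word window over a toy corpus:

```python
# Sketch: building (context, target) pairs from a corpus with a +/-2-word
# window — the supervised task whose learned weights become the embeddings.
corpus = "I want a glass of orange juice".split()
window = 2

pairs = []
for i, target in enumerate(corpus):
    # Every word within `window` positions of the target is a context word.
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            pairs.append((corpus[j], target))

print(pairs[0])  # → ('want', 'I')
```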

7
Q

What is word2vec?

A

A family of algorithms (skip-gram and CBOW) that learn a mapping from each word to an embedding vector by training a model to predict words from their context

8
Q

What is cosine similarity?

A

A measure of similarity between two words computed from their embedding vectors: cos(u, v) = u·v / (‖u‖ ‖v‖), the cosine of the angle between them
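A minimal pure-Python sketch of the formula; the 3-dim king/queen vectors are hypothetical toys, not trained embeddings:

```python
import math

def cosine_similarity(u, v):
    """cos(u, v) = u.v / (|u| * |v|): cosine of the angle between u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dim vectors (hypothetical, not real trained embeddings):
king, queen = [0.9, 0.8, 0.1], [0.85, 0.82, 0.15]
print(round(cosine_similarity(king, queen), 3))  # close to 1.0: nearly parallel
```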

9
Q

Main advantage of word embeddings?

A

They let your model generalise to words that never appeared in your labelled training set, because the embedding is learned from a much larger unlabelled corpus
