C5W2 Word Embeddings Flashcards
What are two ways of representing words?
One-hot representation and embedding (featurized) representation
How can you visualise an N-dimensional embedding space?
Use the t-SNE algorithm to project the embeddings down to 2D (or 3D) and plot them
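A minimal sketch of the t-SNE projection, assuming you already have an array of word vectors and a parallel word list (the random vectors and `word_i` names below are illustrative stand-ins):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

embeddings = np.random.rand(100, 300)          # stand-in for real 300-D word vectors
words = [f"word_{i}" for i in range(100)]      # stand-in vocabulary

# t-SNE maps the 300-D vectors to 2-D while trying to preserve local neighbourhoods
coords = TSNE(n_components=2, perplexity=30, init="pca").fit_transform(embeddings)

plt.scatter(coords[:, 0], coords[:, 1])
for (x, y), w in zip(coords, words):
    plt.annotate(w, (x, y))
plt.show()
```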
What are the advantages of word embeddings?
- You can learn the embedding from large amounts of unlabelled text (or take an open-source pre-trained embedding), and then use transfer learning for your task
- Word embeddings capture analogies such as man : woman :: king : queen (see the sketch after this list)
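A minimal sketch of the analogy trick "man : woman :: king : ?", assuming `word_to_vec` is a dict mapping each word to a 1-D NumPy embedding (a hypothetical name, not a real library object):

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

def complete_analogy(a, b, c, word_to_vec):
    # Find the word w whose embedding is closest to e_b - e_a + e_c
    target = (word_to_vec[b] - word_to_vec[a] + word_to_vec[c]).reshape(1, -1)
    best_word, best_score = None, -np.inf
    for w, vec in word_to_vec.items():
        if w in (a, b, c):
            continue
        score = cosine_similarity(target, vec.reshape(1, -1))[0, 0]
        if score > best_score:
            best_word, best_score = w, score
    return best_word

# complete_analogy("man", "woman", "king", word_to_vec)  # ideally returns "queen"
```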
Difference between one-hot vectors and word embedding vectors?
One-hot vectors are sparse and high-dimensional (one component per vocabulary word), while embedding vectors are dense and comparatively low-dimensional
What is an embedding matrix?
A matrix whose height is the embedding dimension and whose width is the vocabulary size; each column is the embedding vector of one word
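A minimal sketch of the embedding matrix; the embedding dimension of 300 and vocabulary size of 10,000 are illustrative assumptions:

```python
import numpy as np

n_embed, vocab_size = 300, 10_000
E = np.random.randn(n_embed, vocab_size)   # embedding matrix: one column per word

j = 4834                                   # index of some word in the vocabulary
o_j = np.zeros(vocab_size)
o_j[j] = 1                                 # one-hot vector for word j

e_j = E @ o_j                              # multiplying by E picks out column j ...
assert np.allclose(e_j, E[:, j])           # ... so in practice a direct lookup is used instead
```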
How does a model learn embeddings?
You train a model to predict a target word from a context word (or words); the embedding matrix is learned as a by-product of this task
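A minimal Keras sketch (not the exact course model) of this idea: predict a target word from a single context word, with a softmax over the vocabulary; the Embedding layer's weights become the learned word vectors. The vocabulary size, embedding dimension, and the commented-out training pairs are illustrative assumptions.

```python
from tensorflow import keras

vocab_size, embed_dim = 10_000, 300

model = keras.Sequential([
    keras.layers.Embedding(vocab_size, embed_dim),         # one trainable vector per word
    keras.layers.Flatten(),
    keras.layers.Dense(vocab_size, activation="softmax"),  # predict the target word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# context_ids, target_ids = ...  # integer word indices built from a corpus
# model.fit(context_ids, target_ids, epochs=5)
# learned_embeddings = model.layers[0].get_weights()[0]    # shape (vocab_size, embed_dim)
```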
What is word2vec?
An algorithm (skip-gram or CBOW) that learns word embeddings by predicting context/target word pairs; the result is a mapping from each word to an embedding vector
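A minimal sketch of training word2vec with gensim (the 4.x API is assumed); the two-sentence toy corpus is illustrative only:

```python
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
]

# sg=1 selects the skip-gram variant; sg=0 would use CBOW
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vector = model.wv["king"]                          # the learned embedding for "king"
similar = model.wv.most_similar("king", topn=3)    # nearest words by cosine similarity
```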
What is cosine similarity?
A measure of similarity between two words based on their embedding vectors u and v: sim(u, v) = (u · v) / (‖u‖ ‖v‖); values near 1 mean the words are similar
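A minimal NumPy implementation of that formula; the `e_king` / `e_queen` vectors in the usage comment are hypothetical lookups from an embedding matrix:

```python
import numpy as np

def cosine_sim(u, v):
    # Ranges from -1 (opposite direction) to 1 (same direction); ~0 means unrelated
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# e_king, e_queen = ...          # embedding vectors looked up from the embedding matrix
# cosine_sim(e_king, e_queen)    # high value for related words
```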
Main advantage of word embeddings?
Because embeddings are learned from large unlabelled corpora, your model can generalise to words that did not appear in your labelled training set, since their embeddings are close to those of words that did