Lecture 10 : Deep Learning for NLP Flashcards
1
Q
What are 2 things deep learning models use for NLP?
A
-Vector representation of words → word embeddings (see the lookup sketch below)
-Neural network structures
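A minimal sketch of what a word embedding lookup looks like, assuming PyTorch and a toy vocabulary (both are illustrative choices, not part of the lecture): each word index maps to a dense, learnable vector rather than a sparse count-based representation.

```python
import torch
import torch.nn as nn

# Toy vocabulary; the word-to-index mapping is an assumption for illustration
vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3}

# Each word index maps to a dense vector (here 5-dimensional, randomly initialized,
# later tuned by the neural network during training)
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=5)

word_id = torch.tensor([vocab["cat"]])
print(embedding(word_id))  # a learned dense vector, not a count vector
```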
2
Q
What is Word2Vec?
A
-Popular embedding method
-Very fast to train
-Idea: predict rather than count, using raw text from the Web (unsupervised learning); see the training sketch below
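A minimal training sketch using the gensim library (an assumption; the lecture does not name a specific implementation). It shows the unsupervised setup: the only input is raw tokenized text, and the model learns embeddings by predicting words from their contexts.

```python
from gensim.models import Word2Vec

# Tiny toy corpus; in practice Word2Vec is trained on large amounts of raw Web text
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# sg=0 selects CBOW, sg=1 would select skip-gram; window is the context size
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

print(model.wv["cat"])               # the learned 50-dimensional embedding for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbours in embedding space
```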
3
Q
What are the 2 training methods of Word2Vec?
A
1. CBOW (continuous bag of words): given the context words, predict the centre word
2. Skip-gram: given a word, predict each of its surrounding context words (see the pair-generation sketch below)
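A minimal sketch in plain Python (no external libraries; the sentence and window size are illustrative assumptions) of how the two objectives carve (context, target) training pairs out of the same text:

```python
sentence = ["the", "cat", "sat", "on", "the", "mat"]
window = 2  # number of context words on each side of the centre word

cbow_pairs, skipgram_pairs = [], []
for i, word in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    # CBOW: all context words together predict the centre word
    cbow_pairs.append((context, word))
    # Skip-gram: the centre word predicts each context word separately
    skipgram_pairs.extend((word, c) for c in context)

print(cbow_pairs[2])       # (['the', 'cat', 'on', 'the'], 'sat')
print(skipgram_pairs[:4])  # [('the', 'cat'), ('the', 'sat'), ('cat', 'the'), ('cat', 'sat')]
```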