lesson_13_flashcards

1
Q

What is an embedding in machine learning?

A

A mapping of objects (e.g., words, nodes, images) into vectors in a continuous vector space, where proximity indicates similarity.

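A minimal sketch with made-up vectors (illustrative values, not learned ones), showing how "proximity indicates similarity" is usually measured, via cosine similarity:

```python
import numpy as np

# Made-up 4-d embeddings (illustrative values, not learned)
cat = np.array([0.8, 0.1, 0.6, 0.2])
dog = np.array([0.7, 0.2, 0.5, 0.3])
car = np.array([0.1, 0.9, 0.0, 0.8])

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; close to 1.0 means very similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(cat, dog))  # ~0.98: related concepts lie close together
print(cosine_similarity(cat, car))  # ~0.27: unrelated concepts lie far apart
```
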
2
Q

What are word embeddings?

A

Vector representations of words that capture semantic and syntactic relationships, learned from co-occurrence in text data.

3
Q

What is the distributional hypothesis in NLP?

A

The idea that words appearing in similar contexts tend to have similar meanings, forming the basis of word embeddings.

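A tiny illustration of the hypothesis, assuming a toy corpus and a hypothetical helper `context_counts`: counting each word's neighbors gives it a context profile, and words with near-identical profiles ("coffee", "tea") are exactly the ones the hypothesis calls similar:

```python
from collections import Counter

# Toy corpus, assumed purely for illustration
corpus = [
    "i drink hot coffee every morning",
    "i drink hot tea every morning",
    "the car needs new tires",
]

def context_counts(corpus, target, window=2):
    """Count the words that appear within `window` positions of `target`."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i, w in enumerate(words):
            if w == target:
                lo, hi = max(0, i - window), min(len(words), i + window + 1)
                counts.update(words[lo:i] + words[i + 1:hi])
    return counts

# "coffee" and "tea" share the same context profile, so the
# distributional hypothesis treats them as similar in meaning.
print(context_counts(corpus, "coffee"))
print(context_counts(corpus, "tea"))
```
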
4
Q

What is Word2Vec?

A

A neural embedding model that predicts context words from a target word (skip-gram) or the target word from its context words (CBOW).

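A sketch of how skip-gram turns raw text into training examples (toy sentence, hypothetical helper `skipgram_pairs`): every word within the window around a target becomes one (target, context) prediction pair:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (target, context) training pairs as skip-gram does."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = "the cat sat on the mat".split()
for target, context in skipgram_pairs(sentence):
    print(f"predict {context!r} given {target!r}")
# CBOW inverts this: predict the target from its averaged context words.
```
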
5
Q

What are graph embeddings?

A

Learned vector representations of nodes in a graph that encode structural and relational properties for downstream tasks.

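One common recipe is DeepWalk-style training; a sketch with an assumed toy graph: random walks play the role of sentences, and a skip-gram model trained on them yields one vector per node:

```python
import random

# Assumed toy undirected graph as an adjacency list
graph = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"],
}

def random_walk(graph, start, length=5):
    """Uniform random walk; the node sequence plays the role of a sentence."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(graph[walk[-1]]))
    return walk

walks = [random_walk(graph, node) for node in graph for _ in range(10)]
# These walks would then be fed to a skip-gram model (e.g., gensim's
# Word2Vec) so that nodes appearing in similar walks get similar vectors.
print(walks[0])
```
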
6
Q

What is negative sampling in Word2Vec?

A

A training technique that replaces the full softmax with a simpler task: distinguish observed (target, context) pairs from randomly sampled negative pairs, which makes training far more efficient than normalizing over the entire vocabulary.

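Concretely (the objective from Mikolov et al., 2013), for a target word $w$ with one observed context word $c$ and $k$ negatives $c_i$ drawn from a noise distribution $P_n$, training maximizes

$$\log \sigma\!\left(v_c^{\top} v_w\right) + \sum_{i=1}^{k} \mathbb{E}_{c_i \sim P_n}\!\left[\log \sigma\!\left(-v_{c_i}^{\top} v_w\right)\right]$$

pushing real pairs together and sampled pairs apart, with no softmax over the full vocabulary.
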
7
Q

What are hierarchical embeddings?

A

Representations learned in hyperbolic space that capture hierarchical (tree-like) relationships and need far fewer dimensions than Euclidean embeddings to represent the same hierarchy.

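The best-known instance is Poincaré embeddings (Nickel and Kiela, 2017): points live inside the unit ball, and the distance

$$d(u, v) = \operatorname{arcosh}\!\left(1 + 2\,\frac{\lVert u - v \rVert^2}{\left(1 - \lVert u \rVert^2\right)\left(1 - \lVert v \rVert^2\right)}\right)$$

grows rapidly near the boundary, which is what lets trees embed with low distortion in very few dimensions (roots near the center, leaves near the boundary).
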
8
Q

What is fairness in embeddings?

A

Ensuring embeddings do not amplify or perpetuate biases present in the training data; a well-known case is gender-biased analogies in Word2Vec, such as "man : computer programmer :: woman : homemaker" (Bolukbasi et al., 2016).

9
Q

What is PyTorch BigGraph?

A

A scalable framework for training embeddings on large graphs with billions of nodes and edges, using techniques like partitioning.

10
Q

What is intrinsic evaluation of embeddings?

A

Evaluation based on internal properties, such as nearest neighbor quality or analogy tasks, to assess semantic and syntactic relationships.

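A sketch of the standard analogy test (king - man + woman ≈ queen) using assumed toy vectors; with real embeddings the search would run over the full vocabulary:

```python
import numpy as np

# Assumed toy embeddings; real ones would come from a trained model
vocab = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "car":   np.array([0.5, 0.5, 0.5]),
}

def analogy(a, b, c):
    """Solve a : b :: c : ? by nearest cosine neighbor to (b - a + c)."""
    target = vocab[b] - vocab[a] + vocab[c]
    def cos(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return max((w for w in vocab if w not in {a, b, c}),
               key=lambda w: cos(vocab[w], target))

print(analogy("man", "king", "woman"))  # expected: "queen"
```
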
11
Q

What is extrinsic evaluation of embeddings?

A

Assessment based on performance in downstream tasks like classification, clustering, or recommendation.

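A minimal sketch of the idea, using randomly generated stand-in "embeddings" purely to keep it self-contained and runnable: train a simple classifier on top of the vectors and read off its test accuracy as the quality signal:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in data: 200 "documents" embedded into 16 dimensions, binary labels.
# In practice X would come from a real embedding model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.integers(0, 2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
# Higher downstream accuracy => better embeddings for this task
print("accuracy:", clf.score(X_te, y_te))
```
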
12
Q

What is matrix factorization in graph embeddings?

A

A method to decompose the adjacency matrix of a graph into low-dimensional latent factors, representing nodes as embeddings.

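A sketch with a toy 4-node adjacency matrix, using a rank-2 truncated SVD as the factorization; each row of the scaled left factor then serves as that node's embedding:

```python
import numpy as np

# Toy undirected graph: nodes 0-1-2 form a triangle, node 3 hangs off node 2
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Truncated SVD: A ~ U_k diag(s_k) V_k^T
U, s, Vt = np.linalg.svd(A)
k = 2
embeddings = U[:, :k] * np.sqrt(s[:k])  # one 2-d vector per node
print(embeddings)
```
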
13
Q

What is hierarchical softmax in Word2Vec?

A

A technique for computing word probabilities efficiently over large vocabularies by arranging the words as leaves of a binary tree, reducing the cost per prediction from O(V) to O(log V).

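Concretely, each word $w$ is a leaf of a binary tree (a Huffman tree in Word2Vec), and its probability is a product of sigmoid decisions along the root-to-leaf path $n(w,1), \dots, n(w,L(w))$:

$$p(w \mid w_t) = \prod_{j=1}^{L(w)-1} \sigma\!\left(s_j \cdot v_{n(w,j)}^{\top} v_{w_t}\right), \qquad s_j = \begin{cases} +1 & \text{if the path goes left at } n(w,j) \\ -1 & \text{otherwise,} \end{cases}$$

so each prediction touches only $O(\log V)$ inner nodes instead of all $V$ output words.
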
14
Q

What is contextual word embedding?

A

Representations from models such as ELMo and BERT that are generated dynamically from the surrounding sentence, so the same word receives a different vector in each context (unlike static embeddings such as Word2Vec).

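A sketch using the Hugging Face transformers library (assumes the package is installed and can download bert-base-uncased): the same word "bank" receives different vectors in different sentences:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence, word):
    """Return the contextual vector BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    idx = inputs["input_ids"][0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[idx]

v1 = embed("I deposited cash at the bank", "bank")
v2 = embed("We sat on the river bank", "bank")
# Unlike static embeddings, the two "bank" vectors differ with context
print(torch.cosine_similarity(v1, v2, dim=0))
```
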
15
Q

How do embeddings enable recommendation systems?

A

By representing users and items in the same space, embeddings help predict preferences and recommend similar items.

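A minimal sketch with assumed toy vectors: the dot product in the shared space acts as the preference score, and the top-scoring items are recommended:

```python
import numpy as np

# Assumed toy embeddings in a shared 3-d space
user = np.array([0.9, 0.1, 0.4])
items = {
    "sci-fi movie":      np.array([0.8, 0.2, 0.3]),
    "cooking show":      np.array([0.1, 0.9, 0.2]),
    "space documentary": np.array([0.7, 0.1, 0.6]),
}

# Dot product as the affinity score: larger => stronger predicted preference
scores = {name: float(np.dot(user, vec)) for name, vec in items.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {name}")
```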