Week 10: Self-Supervised Learning Flashcards

1
Q

What is self-supervised learning?

A

Self-supervised learning trains models on unlabeled data by deriving surrogate labels from the data itself, so no human annotation is needed.
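
For instance, a common surrogate label is the rotation applied to an image. A minimal PyTorch sketch of this idea (the helper name make_rotation_batch is illustrative, not from the course):

```python
import torch

def make_rotation_batch(images):
    """Create surrogate labels: rotate each image by 0/90/180/270 degrees
    and use the rotation index (0-3) as the training label."""
    views, labels = [], []
    for k in range(4):  # k quarter-turns
        views.append(torch.rot90(images, k, dims=(2, 3)))  # rotate in the H, W plane
        labels.append(torch.full((images.size(0),), k, dtype=torch.long))
    return torch.cat(views), torch.cat(labels)

x = torch.randn(8, 3, 32, 32)           # a batch of 8 unlabeled RGB images
views, labels = make_rotation_batch(x)  # 32 views with 32 surrogate labels
```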

2
Q

What gap does self-supervised learning bridge?

A

It bridges the gap between unsupervised and supervised learning: like unsupervised learning it needs no human labels, yet it trains with explicit prediction targets, as supervised learning does.

3
Q

What are common tasks in self-supervised learning?

A

Common pretext tasks include predicting image rotations, solving jigsaw puzzles from shuffled image patches, and colorizing grayscale images.
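
As a sketch of the jigsaw idea: cut the image into tiles, shuffle them, and let the permutation be the surrogate label (helper name and 2x2 grid are illustrative):

```python
import torch

def make_jigsaw_example(image, grid=2):
    """Cut an image into grid x grid tiles, shuffle them, and return the
    shuffled tiles plus the permutation the model must predict."""
    c, h, w = image.shape
    th, tw = h // grid, w // grid
    tiles = [image[:, i*th:(i+1)*th, j*tw:(j+1)*tw]
             for i in range(grid) for j in range(grid)]
    perm = torch.randperm(grid * grid)
    shuffled = torch.stack([tiles[p] for p in perm.tolist()])  # (tiles, C, th, tw)
    return shuffled, perm

tiles, perm = make_jigsaw_example(torch.randn(3, 32, 32))
```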

4
Q

What is contrastive learning?

A

Contrastive learning learns representations by maximizing agreement between augmented views of the same image while pushing apart representations of different images.
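
A simplified InfoNCE-style loss sketch in PyTorch, assuming z1[i] and z2[i] are embeddings of two augmented views of image i (full SimCLR uses a symmetric 2N-view version of this):

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.5):
    """Pull matching views together, push other pairs apart: row i's
    positive is column i; every other column serves as a negative."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature    # (N, N) cosine similarities
    targets = torch.arange(z1.size(0))    # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

z1, z2 = torch.randn(16, 128), torch.randn(16, 128)  # embeddings of two views
loss = info_nce_loss(z1, z2)
```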

5
Q

What are the SimCLR and MoCo frameworks?

A

SimCLR and MoCo are popular contrastive self-supervised frameworks: SimCLR draws negatives from large in-batch samples, while MoCo keeps a queue of negatives encoded by a momentum-updated key encoder.
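
One detail that distinguishes MoCo is its momentum key encoder. A minimal sketch of that update, with toy encoders as placeholders:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def momentum_update(query_encoder, key_encoder, m=0.999):
    """MoCo-style update: the key encoder tracks an exponential moving
    average of the query encoder, keeping negative keys consistent."""
    for q_param, k_param in zip(query_encoder.parameters(),
                                key_encoder.parameters()):
        k_param.mul_(m).add_(q_param, alpha=1.0 - m)

q_enc, k_enc = nn.Linear(8, 4), nn.Linear(8, 4)
k_enc.load_state_dict(q_enc.state_dict())  # key encoder starts as a copy
momentum_update(q_enc, k_enc)
```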

6
Q

How do pretext tasks contribute to self-supervised learning?

A

Pretext tasks provide automatically generated training signals; solving them forces the model to learn general-purpose features that transfer to downstream tasks.
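
A common way to check that pretext-task features transfer is a linear probe: freeze the pretrained backbone and train only a linear classifier on downstream labels. A sketch with a toy backbone standing in for a real pretrained network:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-in for a backbone pretrained on a pretext task.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
for p in backbone.parameters():
    p.requires_grad = False                 # keep the learned features fixed

probe = nn.Linear(128, 10)                  # linear classifier for 10 classes
optimizer = torch.optim.SGD(probe.parameters(), lr=0.1)

x, y = torch.randn(16, 3, 32, 32), torch.randint(0, 10, (16,))
loss = F.cross_entropy(probe(backbone(x)), y)
loss.backward()                             # gradients reach only the probe
optimizer.step()
```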

7
Q

How does self-supervised learning reduce costs?

A

It reduces dependence on expensive manual labeling, because the training signals are generated from the data itself.

8
Q

What are applications of self-supervised learning?

A

Applications include pretraining for object detection and image classification.

9
Q

How does fine-tuning improve self-supervised models?

A

Fine-tuning adapts a self-supervised pretrained model to a specific task by continuing training on labeled task data, usually with a small learning rate, which improves performance over training from scratch.
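
A sketch of the typical recipe, assuming a pretrained encoder: attach a new task head and update the pretrained weights with a smaller learning rate than the head (the toy layers and rates here are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-in for an encoder pretrained with self-supervision.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
head = nn.Linear(128, 5)                    # new head for a 5-class target task

optimizer = torch.optim.SGD([
    {"params": encoder.parameters(), "lr": 1e-3},  # gentle updates to pretrained weights
    {"params": head.parameters(), "lr": 1e-2},     # larger steps for the fresh head
])

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 5, (8,))
loss = F.cross_entropy(head(encoder(x)), y)
loss.backward()
optimizer.step()
```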

10
Q

What are challenges in self-supervised learning?

A

Challenges include designing pretext tasks that yield transferable features and the high computational cost of pretraining.
