Week 10 Self-Supervised Learning Flashcards
What is self-supervised learning?
Self-supervised learning leverages unlabeled data by deriving surrogate (pseudo) labels from the data itself and training on them with standard supervised objectives.
What gap does self-supervised learning bridge?
It bridges the gap between unsupervised and supervised learning: the data carries no human labels, yet training uses a standard supervised loss over automatically generated targets.
What are common tasks in self-supervised learning?
Common pretext tasks include predicting image rotations, solving jigsaw puzzles, and colorizing grayscale images.
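A minimal sketch of one such task, rotation prediction, assuming a PyTorch setup; the tiny encoder here is a hypothetical placeholder, not a prescribed architecture:

```python
# Rotation prediction: the rotation index itself is the surrogate label.
import torch
import torch.nn as nn

def make_rotation_batch(images):
    """Rotate each image by 0/90/180/270 degrees; the rotation index
    (0-3) serves as the surrogate label."""
    rotated, labels = [], []
    for k in range(4):  # k quarter-turns
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        labels.append(torch.full((images.size(0),), k, dtype=torch.long))
    return torch.cat(rotated), torch.cat(labels)

# Placeholder encoder and 4-way rotation classification head.
encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                        nn.Linear(16, 512))
head = nn.Linear(512, 4)

images = torch.randn(8, 3, 32, 32)            # an unlabeled batch
x, y = make_rotation_batch(images)
loss = nn.CrossEntropyLoss()(head(encoder(x)), y)
loss.backward()
```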
What is contrastive learning?
Contrastive learning learns representations by pulling together embeddings of augmented views of the same image (positives) while pushing apart embeddings of other images (negatives).
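A minimal sketch of the NT-Xent loss used in SimCLR-style contrastive learning, assuming z1 and z2 are embeddings of two augmented views of the same batch of images:

```python
# NT-Xent: each embedding's positive is its other augmented view;
# every remaining embedding in the batch acts as a negative.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)     # 2N x d, unit norm
    sim = (z @ z.T) / temperature                   # pairwise cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))      # drop self-similarity
    # The positive for row i is its other view: i+n for i<n, i-n otherwise.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

z1, z2 = torch.randn(16, 128), torch.randn(16, 128)
print(nt_xent(z1, z2))
```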
What are SimCLR and MoCo frameworks?
SimCLR and MoCo are popular contrastive frameworks: SimCLR draws negatives from large in-batch comparisons, while MoCo keeps a queue of negatives encoded by a slowly updated momentum encoder.
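A sketch of the momentum (EMA) key-encoder update that distinguishes MoCo; the linear encoders stand in for real backbones:

```python
# MoCo-style momentum update: the key encoder is never trained directly,
# only nudged toward the query encoder after each step.
import copy
import torch
import torch.nn as nn

encoder_q = nn.Linear(32, 64)            # query encoder, trained by backprop
encoder_k = copy.deepcopy(encoder_q)     # key encoder, updated only by EMA
for p in encoder_k.parameters():
    p.requires_grad = False

@torch.no_grad()
def momentum_update(m=0.999):
    # Key weights drift slowly toward query weights: k <- m*k + (1-m)*q.
    for pq, pk in zip(encoder_q.parameters(), encoder_k.parameters()):
        pk.mul_(m).add_(pq, alpha=1 - m)

momentum_update()
```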
How do pretext tasks contribute to self-supervised learning?
Solving a pretext task well forces the encoder to capture semantic structure in the data, yielding features that transfer to downstream tasks.
How does self-supervised learning reduce costs?
It reduces the dependency on expensive manual labeling.
What are applications of self-supervised learning?
Applications include pretraining for object detection and image classification.
How does fine-tuning improve self-supervised models?
After self-supervised pretraining, the encoder is adapted to a specific labeled task, either by training a linear probe on frozen features or by fine-tuning all weights at a small learning rate, which typically improves downstream performance.
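A minimal fine-tuning sketch; `encoder` stands in for a network pretrained with a self-supervised objective, and the data is a placeholder:

```python
# Linear probe on frozen self-supervised features; full fine-tuning
# would instead unfreeze the encoder at a smaller learning rate.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 512))  # pretend pretrained
head = nn.Linear(512, 10)                # fresh head for a 10-class task

for p in encoder.parameters():           # freeze the pretrained encoder
    p.requires_grad = False
optimizer = torch.optim.SGD(head.parameters(), lr=0.1)

x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
loss = nn.CrossEntropyLoss()(head(encoder(x)), y)
loss.backward()
optimizer.step()
```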
What are challenges in self-supervised learning?
Challenges include designing pretext tasks that yield transferable rather than shortcut features, and the high computational cost of pretraining, e.g., the large batch sizes contrastive methods require.